Raising the Standard of Prison Education for Inmates
August 27, 2015
By Christopher Zoukis
Few studies of the effects of prison education use high-quality research designs, and some have no scientific merit. Research quality needs to improve so that policymakers can rely on study results when making decisions. Researchers can strengthen study design by paying attention to several specific areas.
A Strategy for Improving Correctional Education Study Design
With 15% or fewer of studies well designed, the standard of research into the effects of correctional education needs to improve (Davis, Bozick, Steele, Saunders, & Miles, 2013). Here are some ideas for improvement.
Pick a Clear, Defined Question to Answer
Several general studies have shown that educating prisoners reduces recidivism, but many questions remain unanswered because most prior studies lack detail.
Many studies have investigated the reduced recidivism associated with GED, college-level, and vocational programs, but few directly compared them. While the RAND Corporation’s meta-analysis calculated recidivism rates for all three, those figures could not be used for direct comparisons (Davis et al., 2013).
There is also limited data on how the length of an educational program affects recidivism, employment, and wages. Research questions should target policymakers’ needs, making decisions easier and helping expand the provision of correctional education.
Do Not Be Too Ambitious
It is better to answer one or two questions clearly than to get inconclusive answers to several. Trying to answer too many questions can lead to misleading results.
Accepted results typically have a calculated p-value of less than 0.05, meaning there is less than a 5% likelihood the finding occurred by chance. That 5% chance applies to each comparison separately. If a study makes 10 comparisons and no real effects exist, there is roughly a 40% chance that at least one finding is a false positive. A statistically aware reader would not know which of the results to believe.
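The arithmetic behind the multiple-comparisons problem can be sketched in a few lines. This is a minimal illustration, assuming independent comparisons and true null hypotheses; it also shows the Bonferroni correction, one common remedy.

```python
alpha = 0.05          # per-comparison significance threshold
comparisons = 10      # number of independent comparisons in the study

# Probability of at least one false positive when every null is true:
# 1 minus the probability that all 10 comparisons correctly come up negative.
familywise_error = 1 - (1 - alpha) ** comparisons
print(f"{familywise_error:.0%}")  # roughly 40%

# Bonferroni correction: shrink the per-test threshold so the
# family-wise error rate stays at or below the original alpha.
bonferroni_alpha = alpha / comparisons
print(bonferroni_alpha)  # 0.005
```

With the corrected threshold, a single finding at p = 0.01 would no longer count as significant, which is exactly the discipline the text recommends: fewer questions, answered more convincingly.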
Clearly Define Endpoints
More than 20 different definitions of recidivism have been used in studies. Researchers should agree on a single definition so that results can be compared across studies.
The Bureau of Justice Statistics has a reasonable definition of recidivism: A criminal act that results in re-arrest, re-conviction, or return to prison within three years of release (Harlow, 2003).
This definition doesn’t count returns to prison for technical violations, such as returning late to a halfway house. However, not all arrests result in conviction, and not all convictions result in a return to prison. Keeping the three endpoints separate matters for cost-effectiveness calculations and should be the rule for most studies.
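Keeping the three endpoints separate can be made concrete with a small sketch. The record structure and field names below are hypothetical, but the logic follows the BJS-style definition: each event counts only if it occurs within three years of release, and the three rates are reported separately rather than collapsed into one.

```python
from dataclasses import dataclass
from typing import Optional

FOLLOW_UP_YEARS = 3.0  # BJS-style three-year follow-up window

@dataclass
class ReleaseRecord:
    """Hypothetical follow-up record for one released inmate. Each field
    is years after release; None means the event never occurred."""
    rearrest: Optional[float] = None
    reconviction: Optional[float] = None
    reincarceration: Optional[float] = None

def endpoint_rates(records):
    """Compute the three recidivism endpoints separately: re-arrest,
    re-conviction, and return to prison within the follow-up window."""
    n = len(records)
    def rate(field):
        hits = sum(
            1 for r in records
            if getattr(r, field) is not None and getattr(r, field) <= FOLLOW_UP_YEARS
        )
        return hits / n
    return {f: rate(f) for f in ("rearrest", "reconviction", "reincarceration")}

sample = [
    ReleaseRecord(rearrest=0.5, reconviction=1.2, reincarceration=1.5),
    ReleaseRecord(rearrest=2.8),   # arrested but never convicted
    ReleaseRecord(),               # no recidivism event
    ReleaseRecord(rearrest=3.5),   # event falls outside the 3-year window
]
print(endpoint_rates(sample))
# {'rearrest': 0.5, 'reconviction': 0.25, 'reincarceration': 0.25}
```

Note how the three rates diverge even in this tiny sample: an arrest without conviction, or an event past the three-year cutoff, changes one endpoint but not the others.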
Prospective Rather Than Retrospective Studies Where Possible
Prospective studies take longer than retrospective studies to yield results, but the results are generally more reliable. Retrospective studies examine historical data and are dependent on the quality and completeness of records not collected for that specific study.
Randomize Whenever Possible
Selection bias is a major factor in correctional education studies. Certain types of inmates volunteer for education programs and may not represent all inmates. Randomization, where researchers randomly assign inmates to a program or none at all, avoids this issue.
Randomization is not always possible; researchers can’t assign inmates to programs they’re unwilling to take. But it can still be used among volunteers. One example would be comparing a training program with and without a supplement of resume writing and job interview training.
Use Appropriate Comparison Groups
The two groups being compared should be as similar as possible except for the intervention under investigation. Using a group with no GED as the control for a group taking college courses is likely to overestimate the drop in recidivism for inmate college students.
Use Sample Size Calculations
Many studies are conducted using whatever inmates or records happen to be available, and may fail because there aren’t enough study subjects to detect the effect of interest.
Online tools are available to help determine the number of subjects needed. They require estimates of expected differences between groups and knowledge of statistics, so it’s best to consult a statistician.
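The kind of calculation those tools perform can be sketched with the standard normal-approximation formula for comparing two proportions. The input rates below are illustrative assumptions, not figures from any study, and as the text says, a statistician should confirm the inputs before a real trial.

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_two_proportions(p1, p2, alpha=0.05, power=0.80):
    """Per-group sample size needed to detect a difference between two
    recidivism rates with a two-sided test at the given alpha and power
    (standard normal-approximation formula)."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # critical value for the test
    z_beta = z.inv_cdf(power)            # quantile for the desired power
    p_bar = (p1 + p2) / 2
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Illustrative inputs: detecting a drop in recidivism from 40% to 30%
n = sample_size_two_proportions(0.40, 0.30)
print(n)  # 356 per group
```

A study run on, say, 100 available records per group would be badly underpowered for an effect of this size, which is exactly the failure mode described above.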
Control for Selection Bias
The best way to reduce selection bias is through randomization. When this is not possible, collect as much information as possible on pre-existing characteristics, which may affect the outcome independently of the education. A list of such characteristics would include:
- Measures of social support
- Prior educational level
- Prior criminal history
- Literacy/numeracy level
- Attitudes towards others
- Offense type
- Offense severity
- Sentence length
- History of substance abuse
- Marital status
- Prior occupation
- Socioeconomic status
- Presence of mental health problems
Consult Experts and Conduct a Feasibility Survey
Getting the optimal study design usually requires input from:
- Subject area experts.
- Policymakers who might rely on the results for decision-making.
- Statisticians (who should ideally be part of the research team).
- Individuals familiar with operations of prisons and parole offices.
Gathering everyone for a day to review and troubleshoot a draft study protocol is a great way to achieve this.
Once the study protocol is almost finished, researchers should hold consultations with front-line workers to ensure the plan is feasible and to get a sense of potential problems.
Finalize the Statistical Analysis Plan Before Seeing Any Data
The best studies are designed to answer a clearly-defined question. The worst are fishing expeditions where some data is collected, interim analyses are performed, and study objectives and procedures are changed or added on the fly.
Before researchers see any data, they should agree on how the data will be analyzed and use dummy data ahead of time for testing. The plan should include specifications on how to deal with missing data.
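A dummy-data dry run can be sketched as follows: the analysis function is written and frozen first, then exercised on simulated data with a known built-in effect to confirm it recovers that effect. The analysis here is deliberately minimal (a difference in proportions); a real plan would also pre-specify the test, confidence intervals, and missing-data rules.

```python
import random

def analyze(treated_outcomes, control_outcomes):
    """Pre-specified analysis: difference in recidivism proportions
    (treated minus control), where each outcome is 1 = recidivated."""
    rate = lambda xs: sum(xs) / len(xs)
    return rate(treated_outcomes) - rate(control_outcomes)

# Dry run on dummy data generated with a known effect, before any real
# data is seen: treated recidivate at 30%, controls at 40%.
rng = random.Random(0)
treated = [1 if rng.random() < 0.30 else 0 for _ in range(5000)]
control = [1 if rng.random() < 0.40 else 0 for _ in range(5000)]
effect = analyze(treated, control)
# The estimate should land near the built-in -0.10 difference
```

If the frozen analysis cannot recover a known simulated effect, the plan gets fixed before real data arrives, not after, which removes the temptation to adjust procedures on the fly.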
Statistical analyses should include appropriate statistics to convey a result’s accuracy and reliability, such as p-values and 95% confidence intervals. Researchers can also use methods that control for confounding variables. These include:
- Fixed effects panel models
- Propensity score matching
- Multivariate regression analysis
- Instrumental variables
- Heckman methods
- Sensitivity analyses
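Propensity score matching, one of the methods listed above, can be sketched in miniature. The scores and identifiers below are hypothetical; in practice the propensity score is the estimated probability of enrolling in the program, typically fit by logistic regression on pre-existing characteristics like those listed earlier.

```python
def nearest_neighbor_match(treated, controls):
    """Greedy 1:1 nearest-neighbor matching on propensity scores.
    `treated` and `controls` map subject id -> propensity score
    (estimated probability of enrolling in the education program)."""
    available = dict(controls)
    pairs = []
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break  # ran out of unmatched controls
        # Pick the unmatched control whose score is closest to this subject's
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        pairs.append((t_id, c_id))
        del available[c_id]  # each control is matched at most once
    return pairs

# Hypothetical scores for two program participants and three non-participants
treated = {"T1": 0.72, "T2": 0.55}
controls = {"C1": 0.50, "C2": 0.70, "C3": 0.20}
print(nearest_neighbor_match(treated, controls))
# [('T2', 'C1'), ('T1', 'C2')]
```

The matched pairs form a comparison group whose enrollment propensity resembles the treated group’s, which is the point of the technique: recidivism differences between the pairs are harder to attribute to who chose to enroll.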
Ideally, the statistician shouldn’t know which group an individual belongs to in order to avoid unintentional bias.
Publishing and Disseminating Results
Providing understandable and reliable results is the key to credibility. It helps advance prison education and maintains the good relationships that facilitate further research. Three aspects of this process are worth mentioning.
- Include an honest discussion of any shortcomings, limitations, potential sources of error, and unanswered questions in the final report.
- Seek opportunities to present the results to corrections staff and inmates involved in the study. This involvement is always appreciated, fosters more interest and motivation, and offers an opportunity for feedback that can improve future studies.
- Make every effort to publish negative results, even if just in a letter to a journal. Negative results are often just as important as positive ones.
Finding a Better, Smarter Way
The intention is to stimulate thought and discussion to help in the design of stronger correctional education studies. Readers should refer to appropriate textbooks and seek expert advice for more guidance. Including an experienced statistician on the team from the beginning is invaluable.
Davis, L. M., Bozick, R., Steele, J., Saunders, J., & Miles, J. N. (2013). Evaluating the effectiveness of correctional education: A meta-analysis of programs that provide education to incarcerated adults. RAND Corporation.
Gaes, G. G. (2008, February 18). The impact of prison education programs on post-release outcomes. Paper presented at the Re-Entry Roundtable on Education, John Jay College of Criminal Justice.
Harlow, C. W. (2003). Education and correctional populations. Bureau of Justice Statistics Special Report, U.S. Department of Justice. NCJ 195670.