
Clinician-delivered cognitive training for children with attention problems: effects on cognition and behavior from the ThinkRx randomized controlled trial


Received 13 February 2018

Accepted for publication 24 April 2018

Published 26 June 2018 Volume 2018:14 Pages 1671—1683

DOI https://doi.org/10.2147/NDT.S165418


Review by Single anonymous peer review


Editor who approved publication: Dr Roger Pinder




Amy Lawson Moore,1 Dick M Carpenter II,2 Terissa M Miller,1 Christina Ledbetter3

1Gibson Institute of Cognitive Research, Colorado Springs, CO, USA; 2College of Education, University of Colorado, Colorado Springs, CO, USA; 3Department of Neurosurgery, Louisiana State University Health Sciences Center, Shreveport, LA, USA

Purpose: The impact of attention problems on academic and social functioning, coupled with the large number of children failing to respond to stimulant medication or behavioral therapy, makes adjunctive therapies such as cognitive training appealing for families and clinicians of children with attention difficulties or childhood attention deficit hyperactivity disorder. However, cognitive training studies have largely failed to demonstrate far transfer effects in this population. This study examined the quantitative cognitive effects and parent-reported behavioral effects of a clinician-delivered cognitive training program with children who have attention problems.
Patients and methods: Using a randomized controlled study design, we examined the impact of a clinician-delivered cognitive training program on processing speed, fluid reasoning, memory, visual processing, auditory processing, attention, overall intelligence quotient score, and behavior of students (n=13) aged 8–14 years with attention problems. Participants were randomly assigned to either a waitlist control group or a treatment group for 60 hours of cognitive training with ThinkRx, a clinician-delivered intervention that targets multiple cognitive skills with game-like, but rigorous mental tasks in 60–90-minute training sessions at least 3 days per week.
Results: The treatment group showed greater mean pretest to posttest change scores than the control group on all variables, with statistically significant differences noted in working memory, long-term memory, logic and reasoning, auditory processing, and intelligence quotient score. Qualitative outcomes included parent-reported changes in confidence, cooperation, and self-discipline.
Conclusion: Children with attention problems who completed 60 hours of clinician-delivered ThinkRx cognitive training realized both cognitive and behavioral improvements.


Keywords: brain training, cognitive rehabilitation, ADHD, LearningRx, cognitive training

Introduction

Attention skills are the number one predictor of academic performance, and attention problems in childhood reduce the probability of graduating from high school by 40%.1 As a neurodevelopmental condition, attention deficit hyperactivity disorder (ADHD) is defined by a persistent pattern of inattention and/or hyperactivity–impulsivity that interferes with functioning and development.2 A primary characteristic of ADHD is a deficit in executive function skills, which manifests as impaired ability to coordinate cognitive processes that include focus, effort, memory, emotional response, activation, and action.3,4 These deficits impact the individual’s skill in managing time, organizing everyday tasks, regulating emotions and frustration tolerance, maintaining focus, managing behavior, and even remembering important information. Extant research shows specific patterns of executive functioning deficits in people with ADHD,5 revealing lower test scores on executive processing, attention, and cognitive fluency,6 as well as working memory7 and processing speed.8

Barkley et al9 report that >90% of people diagnosed with ADHD are identified by age 12, which then impacts their school performance and social relationships throughout the life span. ADHD is also associated with dropping out of school, criminal behavior and incarceration, teenage pregnancy, substance abuse, accidental injury, and automobile accidents.10 However, there is a nonclinical population of children who are not definitively diagnosed with ADHD, but for whom hyperactivity, impulsivity, and attention problems create similar academic, social, and emotional struggles as for their ADHD peers. Parents of these children report similar frustrations, lack of control, and need for interventions as do parents of children with diagnosed ADHD.11 Further, children with subclinical attention problems struggle not only in the classroom but also in supplemental educational service interventions such as tutoring.12 Therefore, interventions should target children with ADHD and children with subclinical attention problems with equal importance.

Standard of care for children with ADHD is stimulant medication and behavior therapy.13 Although there is widespread use of stimulants and research support for behavior modification as an effective intervention for ADHD,14 a large number of children do not respond fully to stimulant medication or to behavioral therapy.15 The impact of attention problems and ADHD on academic and social functioning thus remains a concern for parents, and adjunctive or alternative therapies have gained appeal for families and clinicians of children with attention struggles.16

Cognitive training is one such therapy. Cognitive training is an intervention that targets the remediation of cognitive skills using engaging mental tasks.17 At this time, a majority of cognitive training programs are delivered by computer, including Cogmed RoboMemo,18 Play Attention!,19 or BrainTrain.20 Although some research on computerized cognitive training for participants with ADHD reports improvements in hyperactivity and attention skills21 and a reduction in attention difficulties,20,22 the evidence from randomized controlled trials remains equivocal. Because of this, cognitive training for ADHD was assigned "experimental" status in a review of evidence-based treatments for the disorder.14 Indeed, improvements in the trained tasks are frequently noted, but evidence of transfer effects or long-term benefits is scant.22,23 A meta-analysis of cognitive training outcomes for ADHD across 15 studies revealed improvements in trained tasks of working memory, but limited effects on ADHD symptoms and no transfer to untrained tasks, leading the authors to suggest that training techniques should target multiple neuropsychological processes across domains.24 In addition, training programs should include applications to tasks in the real world.

The current study aimed to examine the effects of such a program with children who have attention problems. We selected a training program that differs from those in the existing ADHD literature in two ways: it is delivered by a clinician rather than by a computer, and it includes deliberate distractions designed to improve selective and divided attention skills. ThinkRx25 is a 60–72-hour clinician-delivered cognitive training program that targets multiple cognitive skills with game-like but rigorous mental tasks in 60–90-minute training sessions at least 3 days per week. The intensity is tightly controlled by the clinician using a metronome, timer, and deliberate distractions to "load" the participant with several simultaneous tasks. Grounded in the widely accepted Cattell–Horn–Carroll (CHC) theory of cognition,26 the program is designed to target multiple skills including working memory, long-term memory, processing speed, logic and reasoning, visual processing, auditory processing, and attention. Given that multiple cognitive deficits are associated with ADHD, we hypothesized that interventions grounded in a comprehensive theory of intelligence may be well suited to addressing those multiple cognitive constructs. Further, using a cognitive training intervention that includes deliberate distractions challenges the existing paradigm that children with attention problems or ADHD need accommodations that remove distractions from their environment. The method of delivering the program one-on-one by a clinician is supported by Feuerstein’s theory of structural cognitive modifiability,27 which describes the malleability of intelligence through mediated interactions with environmental stimuli. That is, when an adult purposefully coaches a child during a learning experience, it builds the child’s capacity for learning and thinking.28 In prior research with children and adolescents with learning struggles, training with ThinkRx has yielded significant gains across cognitive constructs including working memory, long-term memory, fluid reasoning, processing speed, auditory processing, and visual processing.29–32 Nonexperimental but objective assessment data on the ThinkRx program, based on analyses of pretest to posttest cognitive testing results for clients with attention problems and ADHD (n=5,416), have been positive, including significant gains in long-term memory, working memory, processing speed, visual and auditory processing, fluid reasoning, and broad attention.33 However, to our knowledge, a controlled trial with this population has not been conducted.

Although the program is delivered by a clinician, ThinkRx cognitive training differs from behavior therapy and traditional psychological interventions such as play therapy or talk therapy. A key difference from behavior therapy is the deliberate addition of distractions rather than the elimination of environmental stimuli. Further, the goal of cognitive training is to remediate the cognitive deficits frequently associated with ADHD rather than the emotional and behavioral manifestations of the disorder. However, in our clinical experience with the program, we have noted that these behaviors are mitigated as an unintentional, but pleasant side effect of the intervention. The aim of the current study was to examine those qualitative behavioral changes as reported by the parents as well as the quantitative cognitive changes measured by the Woodcock–Johnson III34 following ThinkRx cognitive training for children with attention problems in a clinic setting.

Patients and methods

Participants and group design

Participants were a subset of a larger published study.29 The sample was recruited through emails sent to a list of families who had inquired about brain training at the LearningRx brain training center in Colorado Springs in the 3 years prior to the study (n=2,241). Eligibility criteria included children aged 8–14 years who lived within commuting distance of the study site and who scored between 70 and 130 on the General Intellectual Ability (GIA) composite35 of the Woodcock–Johnson III – Tests of Cognitive Abilities34 at the time of screening. Thirty-four families responded to the recruitment email. Thirty-two families had children who met the screening criteria. We used blocked sampling of individuals and sibling pairs to minimize the risk of attrition, as well as the risk of contamination that could arise if siblings were assigned to different groups. Participants were then randomly assigned to either a treatment group to complete 60 hours of cognitive training or a waitlist control group. The subset of the original study sample (n=39) used for the current study included 13 participants. The treatment group (n=6) included three females and three males, with a mean age of 10.3 years. The control group (n=7) had two females and five males, with a mean age of 11.0 years. All participants had a parent-reported diagnosis of ADHD, which was the basis for their selection for the current study. One participant in the treatment group was on medication for ADHD; his medication status remained stable throughout the study. No participants in the control group were on medication during the study. In accordance with the Declaration of Helsinki, parents provided written informed consent, and the minor participants assented to participating in the study. The study was approved by the Institutional Review Board (IRB) at the Gibson Institute of Cognitive Research under protocol #20150515.

Quantitative outcome measures

Supervised by a doctoral-level educational psychologist, clinicians at the master’s level who were blind to the treatment condition of the participants administered the Woodcock–Johnson III – Tests of Cognitive Abilities (subtests 1–7 and 10). The Woodcock–Johnson III test battery is firmly grounded in the CHC theory of cognition,36 as is the intervention itself. Although the testing tasks differ substantially from the training tasks, it was important to choose an assessment aligned with the constructs the intervention targets. Trained research assistants administered the Flanker test from the National Institutes of Health (NIH) Toolbox Cognition Battery.37 The interval from pretest to posttest ranged from 13 to 16 weeks (M=14.4) for the treatment group and from 10 to 17 weeks (M=14.5) for the control group. Table 1 provides a brief description of the tests.

Table 1 Brief description of Woodcock–Johnson III tests and constructs measured
Note: All measures except for attention were from the WJ III – tests of cognitive abilities.
Abbreviations: GIA, General Intellectual Ability; NIH, National Institutes of Health; WJ III, Woodcock–Johnson III.

Qualitative outcome measures

Semistructured interviews were conducted with parents of each participant in the treatment group at the midpoint and on conclusion of the intervention. During these parent meetings, the first author collected responses to the following open-ended question: What specific changes have you seen in your child since the start of training? Responses were documented in writing by the researcher during the meetings. If parents gave a vague response, the researcher would follow up with the prompt, “Tell me what that looks like.” Leaving the question open-ended and nondirectional allowed parents to report either positive or negative changes. That is, we did not assume that all potential changes presented by the participants would be positive. While parents enrolled their children in the program with the goal of realizing improvement in attention, behavior, achievement, or other struggles evident in their children, it is possible that the participants would present either no changes or perhaps even negative changes, which could take the form of exacerbated struggles or the appearance of entirely new ones.

Intervention

Participants in the treatment group completed 40 training sessions, attending three or four 90-minute cognitive training sessions each week over the 15-week study period. Sessions were administered by five certified cognitive trainers at two locations: a cognitive training center and a cognitive science research laboratory set up with training rooms designed to mimic the environment at the training center. To be a cognitive trainer with this program, a minimum of a bachelor’s degree and 60 hours of training and mentoring in the program are required. For the current study, trainers’ education included a master’s degree in education (n=1), a master’s degree in cognitive neuroscience (n=1), a bachelor’s degree in psychology (n=2), and a bachelor’s degree in education (n=1). On-site master trainers monitored day-to-day program fidelity. All participants in the treatment group completed the required 60-hour protocol through attendance at all 40 training sessions. The waitlist control group participants began their cognitive training intervention following the treatment group’s completion of their training.

The cognitive training intervention used in the study was the commercially available program ThinkRx, a clinician-delivered intervention available at LearningRx Brain Training Centers (learningrx.com). The training has been extensively described in two prior manuscripts.29,30 Briefly, the training program is delivered one-on-one by a cognitive trainer using a 230-page curriculum of 23 training tasks that have >1,000 variations and difficulty levels. Sitting across a table from the participant, the trainer utilizes a metronome, shape and number cards, manipulatives, activity worksheets, and even a mini trampoline to deliver the program. Firmly grounded in the CHC theory of intelligence, which describes a multiple-construct view of cognition,36 the ThinkRx training tasks target multiple cognitive skills, including visual and auditory processing, working memory, long-term memory, processing speed, logic and reasoning, and attention. Training tasks are loaded with additional mental activities such as mathematical calculations, counting aloud on beat, or answering questions aloud while sustaining attention to a visual or an auditory task. Trainers provide constant feedback while the participants progress through each level of difficulty. Figure 1 shows an example of a training task. This task targets working memory while simultaneously training sustained attention, visual span, visual discrimination, and processing speed. The clinician arranges cards into a pattern on a grid, and the participant studies the pattern for 3 seconds. Then, the clinician covers his side of the work board, and the participant must reproduce the pattern of cards from memory while counting aloud to the beat of the metronome. There are nine levels of increasing difficulty and 34 variations in this training procedure.

Figure 1 Example of a clinician-delivered working memory training procedure.

ThinkRx cognitive training diverges from traditional attention deficit interventions in a primary way: the use of deliberate distractions. Instead of removing distractions from the environment, this intervention integrates deliberate distractions into every aspect of the training environment. For example, the metronome is used in most tasks to train divided attention. Further, trainers try to distract participants during a task by walking around the participant, making funny faces or sounds, singing a song, clapping to a different beat than the metronome, or saying the wrong answers while the participant is responding. The training is also delivered in an open space with 10–15 clients at a time, and the use of deliberate distractions is designed to mimic the real world, where children with attention problems or ADHD are inundated with distractions and external stimuli. In the clinic, clients must adapt to training amidst the noise of multiple conversations, ticking metronomes, and buzzing timers.

Another distinction between the ThinkRx program and other brain training programs is the relationship with the trainer. Unlike computerized games, the training tasks are delivered by a human being who gives dynamic feedback and constantly adjusts the training session to increase the intensity as the client masters a task or decrease the intensity to adjust for frustration or fatigue. A key component of the training session is a review of the client’s goals and a conversation about the application of training gains outside of the training environment. This metacognitive activity is designed to help the client see how the skills translate to their lives and to increase motivation in setting and reaching training goals.38 Consistent with research on the importance of the therapeutic relationship to treatment efficacy,39 this dynamic is a critical part of the training model. An example of this might begin with a client identifying a real-life academic goal of being able to learn the abbreviations of all the elements on the periodic table – a common objective for high school chemistry classes. Learning the abbreviations of >100 chemical elements requires strong memory skills. One memory training task in the ThinkRx program helps clients use visualization and association to learn the names of all 45 presidents of the USA and recite them from memory backward and forward in less than a minute. This training task targets not only long-term memory but also visual processing and processing speed. After the client masters the task, the trainer could revisit the client’s original goal of learning the abbreviations of all the chemical elements and brainstorm with the client how he could apply his new visualization technique to reach this academic goal.

Another way that the trainer might help the client apply training techniques to the real world is by helping him identify which cognitive skills are used in various activities and how strengthening those skills might benefit the client outside of the training environment. For example, a client who plays baseball might set a goal of improving his batting average. The trainer would first help him identify the cognitive skills needed for batting such as visual processing, processing speed and reaction time, and even logic and prediction skills. During the training sessions, the trainer identifies which skills the client is developing with each training task and at the end of the session revisits how these skills are used related to his goals. For this client, she might say, “Today, you mastered level 8 of this logic and reasoning task. Excellent job! How might you use this type of logic strategy when you are at bat?”

Not only do trainers help clients think about how to apply their new skills to everyday activities outside the training environment, they also help them identify and articulate improvements that they are seeing at home, in school, and in their extracurricular activities. The beginning of every training session is spent identifying and documenting how clients are applying their new skills in the real world. They might identify such things as improved grades or sports play, remaining engaged for longer periods, remembering to turn in homework, improved relationships with siblings or friends, being recognized at school for taking a leadership role, reading a map better on a scouting excursion, or cleaning their room without being asked. These conversations help motivate the client, help the client apply the training to their lives, and encourage a positive relationship between the trainer and the client.

Data analysis

Because a traditional parametric analysis of variance is not appropriate for a small sample size,40 all between-group test data were analyzed using nonparametric tests. To check the randomization and ensure statistical equivalency between groups, we first analyzed between-group differences in demographics with Mann–Whitney and χ2 tests. Next, between-group differences in the dependent measures were tested using Mann–Whitney tests. The dependent variables were the difference scores between the pretest and posttest for each measure; that is, we used a difference-in-difference analysis on all the variables. Because there were nine comparisons (one per measure), we applied a Bonferroni correction for multiple comparisons. Effect sizes were calculated using r.
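
For illustration, the following sketch (in Python; not the study's analysis code) shows how such a difference-in-difference comparison might be computed for a single measure, using a Mann–Whitney test, a Bonferroni-corrected α for nine comparisons, and the r effect size approximation. The score arrays and the normal-approximation conversion from U to Z are assumptions for the example.

```python
# Minimal sketch, not the study's actual analysis code: a difference-in-difference
# comparison for one measure using a Mann-Whitney test, a Bonferroni-corrected
# alpha, and the r effect size approximation (r = |Z| / sqrt(N)).
# The score arrays below are hypothetical placeholders.
import math
from scipy.stats import mannwhitneyu

def change_scores(pre, post):
    """Pretest-to-posttest difference score for each participant."""
    return [b - a for a, b in zip(pre, post)]

# Hypothetical W scores for one measure (e.g., working memory)
treatment = change_scores(pre=[478, 490, 485, 500, 472, 495],
                          post=[502, 510, 509, 521, 498, 515])
control = change_scores(pre=[480, 488, 493, 476, 499, 484, 491],
                        post=[482, 486, 495, 475, 501, 483, 492])

NUM_MEASURES = 9
alpha = 0.05 / NUM_MEASURES  # Bonferroni-corrected alpha, roughly 0.006

u_stat, p_value = mannwhitneyu(treatment, control, alternative="two-sided")

# Normal approximation of Z from U (ties ignored), then r = |Z| / sqrt(N)
n1, n2 = len(treatment), len(control)
z = (u_stat - n1 * n2 / 2) / math.sqrt(n1 * n2 * (n1 + n2 + 1) / 12)
r = abs(z) / math.sqrt(n1 + n2)

print(f"U = {u_stat:.1f}, p = {p_value:.4f} (alpha = {alpha:.3f}), r = {r:.2f}")
```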

Next, treatment group test data were analyzed for clinically significant change and the reliable change index (RCI). To determine the clinical significance of the training gains for individual participants in the treatment group, we used a two-part procedure. The first indicator of clinical significance is a change in score from a clinical level to a level one would expect from a healthy individual, represented here by the age-matched normative sample. Using the Jacobson–Truax method,41 we determined the cut-point of the healthy population for each measure, or the value above which a score is more likely to fall in the healthy population distribution of scores. The cut-point c is calculated algebraically, taking the form:

c = (SDd × Mn + SDn × Md) / (SDd + SDn)

where Md is the study sample mean; SDd is the study sample SD; Mn is the Woodcock–Johnson age-matched standardization sample mean; and SDn is the Woodcock–Johnson age-matched standardization sample SD.
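
As a worked illustration only (not the authors' code), the cut-point defined above can be computed as follows; the mean and SD values are hypothetical placeholders rather than figures from the study.

```python
# Minimal sketch of the Jacobson-Truax cut-point; all input values are
# hypothetical placeholders, not figures from the study.
def cut_point(m_d: float, sd_d: float, m_n: float, sd_n: float) -> float:
    """Cut-point c: the score above which a result is more likely to belong to
    the normative (healthy) distribution than to the study-sample distribution."""
    return (sd_d * m_n + sd_n * m_d) / (sd_d + sd_n)

# Hypothetical working-memory W-score parameters
print(round(cut_point(m_d=488.0, sd_d=10.5, m_n=500.0, sd_n=12.0), 1))  # -> 493.6
```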

Next, we calculated whether the magnitude of each change was statistically reliable using the RCI for each participant. This index indicates change in the individual beyond what might be expected by chance due to variability in the testing instrument. Using the standard error of the difference score in the classical measurement theory RCI formula, changes exceeding 1.96 times the standard error would be expected to occur by chance less than 5% of the time. The formula for calculating the RCI took the form:

RCI = (X2 − X1) / Sdiff

where X1 is the participant’s Woodcock–Johnson pretest standard W score; X2 is the participant’s Woodcock–Johnson posttest standard W score; and Sdiff is the standard error of the difference score.
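
Similarly, the sketch below (illustrative, not the authors' code) combines the RCI with the cut-point to yield the recovered/improved/unchanged/deteriorated categories reported in the Results; the W scores, Sdiff, and cut-point value are hypothetical.

```python
# Minimal sketch of the RCI and the recovered/improved/unchanged/deteriorated
# classification described in the Results; inputs are hypothetical placeholders.
def rci(x1: float, x2: float, s_diff: float) -> float:
    """Reliable change index: posttest minus pretest W score divided by the
    standard error of the difference score."""
    return (x2 - x1) / s_diff

def classify(x1: float, x2: float, s_diff: float, cut: float) -> str:
    """Apply the 1.96 reliability threshold and the cut-point criterion."""
    index = rci(x1, x2, s_diff)
    if index > 1.96:
        return "recovered" if x2 >= cut else "improved"
    if index < -1.96:
        return "deteriorated"
    return "unchanged"

# Hypothetical participant: pretest 486, posttest 512, Sdiff 6.2, cut-point 493.6
print(classify(x1=486.0, x2=512.0, s_diff=6.2, cut=493.6))  # -> recovered
```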

Finally, we conducted a thematic analysis on the qualitative data from parents at mid-training and posttraining interviews. We evaluated parent meeting notes using inductive thematic analysis, a process of carefully evaluating data with the goal of allowing the experiential responses to combine into descriptive themes of the phenomenon.42 This approach provides the opportunity to discern emerging themes43 without prior bias or researcher expectations of specified outcomes.

We first collected the interview comments into a simple document and formatted them in direct quotation phrases without evaluative truncation. Then, two coders independently went through the data and developed coding schemes, color-coding the responses according to reported changes in day-to-day behaviors and interactions. After a thorough analysis of the 54 comments, the coders sorted them into major themes. Then, they met to discuss similarities and differences in their coding schemes – an important step in analytical triangulation of qualitative data. After reaching consensus, we used the primary themes to describe how parents perceived changes in their children after cognitive training.

Results

Fidelity and group equivalency

All the participants in this analysis completed the study. All six members of the treatment group completed the required 60 hours of cognitive training. There were no significant differences between the groups on demographic variables (age: U=17.0, p=0.56; gender: χ2=0.63, p=0.43). Data screening revealed no missing data, and skewness was in tolerable ranges for all variables. Finally, there were no statistically significant differences between the treatment and control groups on pretest scores (associative memory: U=16.0, p=0.53; visual processing: U=12.5, p=0.23; auditory processing: U=16.0, p=0.53; processing speed: U=18.0, p=0.73; logic and reasoning: U=10.0, p=0.14; intelligence quotient (IQ): U=11.5, p=0.10; long-term memory: U=21.0, p=0.18; attention: U=14.0, p=0.37), with the exception of working memory (U=4.0, p=0.01). These largely null differences indicate that the randomization process created groups that were statistically equivalent in the pretraining period, allowing for greater confidence that findings were associated with the intervention and not with systematic differences in group characteristics.

Results of statistical testing of dependent measures

Participants in the treatment group showed greater median difference scores than the control group on all measures except visual processing (Figure 2). In the control group, median difference scores were negative on four measures and 0 on a fifth; the control group showed positive median difference scores on only four measures.

Figure 2 Comparison of treatment and control groups on median pretest to posttest change in test scores.
Abbreviation: GIA, General Intellectual Ability.

Table 2 provides the descriptive statistics (medians, mean values, confidence intervals, and SDs). When examining differences in the difference scores (the final column), the greatest gaps between groups were on long-term memory and logic and reasoning, with the smallest gaps noted in visual processing and attention.

Table 2 Descriptive statistics for pretest to posttest change scores and overall difference between groups
Abbreviation: IQ, intelligence quotient.

Mann–Whitney results (Table 3) show that the difference between groups was statistically significant on five measures – auditory processing, logic and reasoning, working memory, long-term memory, and IQ score – using a conventional α (0.05). However, after the Bonferroni correction, which set the α at 0.006, the difference in auditory processing was no longer significant. Turning to effect sizes, we calculated the r-approximation as the appropriate value for use with Mann–Whitney nonparametric tests.44 The greatest effects of the intervention were measured on IQ score and working memory, followed by long-term memory and logic and reasoning; all of these effects were large. The smallest effect was measured on visual processing.

Table 3 Statistical comparisons between treatment and control groups for each construct measured
Abbreviations: IQ, intelligence quotient; p, probability; r, effect size; U, Mann–Whitney U value.

Results of clinical significance testing and RCI

Table 4 illustrates the results of determining the clinical significance of the changes from pretest to posttest. The cut scores for each variable are included as a reference, and we annotated the posttest scores meeting the cut score threshold. As noted, 100% of the GIA composite posttest scores, the long-term memory posttest scores, logic and reasoning posttest scores, working memory posttest scores, and visual processing posttest scores met the threshold for 95% probability of occurring in a normal population.

Table 4 Cut score thresholds and clinically significant change in Woodcock–Johnson W scores
Notes: aPosttest scores met cut score threshold for clinically significant change. Case A: 11y male, Case B: 9y female, Case C: 10y female, Case D: 10y male, Case E: 11y male, Case F: 11y female.

Table 5 illustrates the RCI of each measure for each participant. The qualitative descriptions printed under each RCI are defined as follows: participants are considered “recovered” if their posttest score met the cut score threshold and the RCI was statistically reliable, that is, >1.96. Participants with RCIs >1.96 whose posttest scores did not meet the cut score threshold are classified as “improved.” RCIs between 1.96 and −1.96 are considered “unchanged.” RCIs <−1.96 are classified as “deteriorated.”

Table 5 Magnitude of change by case and construct measured
Notes: aSignificant reliable change index >1.96. Case A: 11y male, Case B: 9y female, Case C: 10y female, Case D: 10y male, Case E: 11y male, Case F: 11y female.
Abbreviations: D, deteriorated; R, recovered; U, unchanged.

All six treatment group participants obtained a clinically significant change and a significant RCI on the GIA, indicating overall recovery effects from the intervention. Excluding the GIA composite scores, 31 of the 48 subtest score changes were clinically significant and revealed recovery (65%) across participants. Fourteen remained unchanged (29%), and two deteriorated from pretest levels (4%). The rates of recovery on individual subtest scores ranged from 33% to 100%. On long-term memory, 100% (six out of six) showed recovery. In associative memory, logic and reasoning, and working memory tests, 83% (five out of six) showed recovery. In tests of visual processing, processing speed, and attention, 50% (three out of six) showed recovery. The lowest recovery percentage was on auditory processing, with 33% (two out of six) showing recovery.

Results of qualitative thematic analysis of reported improvements

Turning to qualitative results, analyses revealed three themes of parent-reported changes in the treatment group participants following cognitive training: confidence, self-discipline, and cooperation. We present these in order of frequency of mention by respondents.

Confidence

Parents of five of the six participants in the treatment group reported changes in “confidence or self-esteem,” saying, for example, “(Her) confidence level is also very high. You can tell she believes in herself and wants to do her best at tasks,” and “(He) doesn’t seem to be intimidated by new challenges or events.” One parent said, “He has stopped asking for confirmation on his comments. He used to make a comment like, ‘Cheetahs are the fastest animal in the world, right Dad?’ He now has more confidence with his statements and will tell you why a cheetah is the fastest animal in the world.” Another said, “When we go on hikes, (He) has become the leader of the pack, confident in his new direction skills.” One-third (18 of 54) of all comments made by parents were related to confidence and self-esteem.

Self-discipline

Parents of four of the six treatment group participants reported changes in self-discipline saying things such as “fewer meltdowns with less intensity and shorter duration,” “improved motivation for chores,” “has accomplished her homework without really being told,” and “(He) has taken more responsibility when given a task or a job. He’s been taking care of the neighbor’s dog and we never once had to remind him.” One-third (18 of 54) of all comments were coded as changes in self-discipline.

Cooperation

Parents of four of the six treatment group participants reported examples of cooperativeness saying things such as “(She) has been a miracle the last few months. Her outlook at home has completely changed. She picks up after herself, does her chores without being told, and does tasks around the house without even being asked,” “One of the biggest changes we’ve seen in her is her manners have really been great. She asks for permission when she would like to do something or would like something,” “Remembering to make her bed and other chores,” and “(He has been) more cooperative and not as many fits.” Nearly 30% (15 of 54) of all comments were related to the theme of cooperation.

Other reported changes

Although no additional dominant themes emerged from the data analysis, parents noted several other types of changes and concerns, some of which were not positive. Two parents reported a desire for more changes, saying things such as, “(He) is still struggling with memory and attention,” and “I would like to see more focus at home.” Another parent indicated she was disappointed that the intervention had not changed her child’s negative behaviors at home, saying, “He is defiant and chooses his own behavior.” However, two parents noted positive changes in academic performance, and two parents indicated that their children’s sleep habits had improved.

Discussion and conclusion

The aim of the current study was to examine the quantitative cognitive effects and parent-reported behavioral effects of ThinkRx, a clinician-delivered cognitive training program, with children who have attention problems. Our main finding from the between-group analyses is that the ThinkRx cognitive training intervention improved cognitive skills for the treatment group, which outperformed the control group on mean change scores for all variables. This finding is consistent with our prior research on the ThinkRx program, in which we found significant differences between treatment and control groups on multiple cognitive measures.29–31 In the current study, we found statistically significant differences between groups on measures of auditory processing, logic and reasoning, working memory, long-term memory, and IQ score, with large effect sizes. We did not find statistically significant differences between groups on visual processing or attention. However, as noted in the larger study,29 the statistical results of the attention analysis may reflect psychometric limitations of the NIH Toolbox Flanker test. The convergent validity of the test with the Delis–Kaplan Executive Function System (D-KEFS) Inhibition Test45 was just 0.34 for ages 8–15 years, and significant practice effects were found in the pediatric validation study.37

Because this sample was a clinical subset of a larger randomized controlled trial, we also examined clinical significance for individual treatment group participants. Consistent with the growing and critical trend of reporting clinically significant change in research on psychological measures,46 we examined a quantifiable measure of participants’ meaningful change and return to normal functioning. Overall, all six treatment group participants could be classified as “recovered” given their clinically significant changes in IQ scores. All six treatment group participants achieved clinically significant changes in at least five cognitive constructs.

Findings from both the between-group analyses and the within-person clinical significance testing are unique compared with studies of computer-based cognitive training programs used with similar samples. As in our study, improvements in untrained tasks of working memory have been reported across studies with children who have attention problems,21–24 but to our knowledge, this is the first randomized controlled study on a cognitive training program with children who have attention problems to document improvements in untrained tasks of auditory processing, logic and reasoning, processing speed, and long-term memory in addition to working memory. Given that cognitive profiles in children with attention problems and ADHD frequently include deficits in working memory, long-term memory, and processing speed,47 our findings have important clinical implications. That is, the ThinkRx program may have the potential to address the heterogeneous cognitive deficits for this population. An intervention that targets multiple cognitive constructs should necessarily lead to improvement in multiple cognitive constructs.

The proposed mechanism of change in the current study rests not only on the assumption of neuroplasticity – that the brain can change with experience – previously demonstrated in functional magnetic resonance imaging studies of the ThinkRx program,48,49 but also on the intensity and targeted nature of the training program. For example, we found significant between-group differences in processing speed. Not only do 12 of the ThinkRx training procedures directly target processing speed, but nearly all of the remaining training procedures also include the use of a stopwatch to promote increasingly faster response times or completion rates for each level of the task. Almost all the training procedures also use a metronome. The speed of responses in the early levels of each task begins at 60 beats per minute but gradually increases to 160 beats per minute as training progresses. Thus, the program is designed to improve speed of cognitive processing. As another example, we also found significant between-group differences on the measure of logic and reasoning. This is most likely due to the focus of five training procedures designed to improve reasoning skills including sequential processing, planning, problem-solving, inductive and deductive logic, and causal reasoning. Although the training tasks were qualitatively different from the assessment tasks, the training equips the participants with skills that can be used to tackle other measures of the same constructs.

The thematic analysis of qualitative data produced several noteworthy findings, particularly that the training effects appeared to be associated with real-world improvements in confidence, self-discipline, and cooperation. This is particularly striking because even though the ThinkRx protocol is designed to address cognitive deficits – and these were the reasons parents sought out the intervention – parents of all six treatment group participants reported at least one type of behavioral improvement, a form of improvement they had no reason to expect to observe or to report. This finding is in stark contrast to the lack of transfer in the existing literature on cognitive training interventions.50 To our knowledge, this is the first study to document improvements in confidence, self-discipline, and cooperation following cognitive training for children with attention problems. Further, it is interesting to note the similarities between observations made by parents and the typical responses given by participants during the metacognitive activity of the intervention. Although participant comments during the metacognitive activities were not a formal part of the analysis, we nonetheless examined them to determine the extent to which they were consistent with the themes of our qualitative analysis. Consistency was quite high. For example, parents reported increased cooperation, which aligned with improvements that several of the participants themselves identified during training. One participant spoke at length about helping his sister without being asked. The same similarities were found with reported changes in self-discipline. Participants gave their trainers examples of improved self-discipline, such as cleaning the kitchen, showering, and packing for a trip without being asked. These comparisons are, of course, anecdotal and limited in their generalizability, since participants were asked only to identify improvements, whereas parents were asked to identify changes of any kind. Nevertheless, the consistency between parent and participant observations was notable and worthy of future research.

An obvious limitation of the current study is the small size of the sample. However, including clinically significant change indices and qualitative data adds robustness to the reported results. Another limitation of the study is that the ADHD diagnosis was reported by the parents rather than established by the researchers. Although all the participants had been previously diagnosed by a physician or psychologist as having ADHD, we mitigated this potential concern by limiting our classification of them to having attention difficulties rather than ADHD. However, it is important to note that there are precedents in the extant research that support the use of parent-reported diagnoses for research inclusion. For example, Visser et al reported that parent-reported diagnosis of ADHD among 590 children in California had strong convergent validity, with no statistically significant difference from the diagnosis reported in health insurance records, and concluded that parent-reported diagnosis of ADHD is appropriate for monitoring state and national prevalence of the diagnosis.51 In a similar study, Warnell et al examined the validity of parent-reported diagnosis of autism spectrum disorder (ASD) among a large sample (n=1,000) of the first children in the UK to enroll in an ASD research database.52 The authors found a 96% agreement between parent- and physician-reported diagnosis and concluded that it was appropriate to use parent-reported diagnosis for inclusion in the ASD research database. Nevertheless, future research on ThinkRx cognitive training for ADHD should include a larger sample size and pre- and post-intervention ADHD-specific symptom measures, such as the Swanson, Nolan, and Pelham Teacher and Parent Rating Scale (SNAP-IV)53 or the Behavior Rating Inventory of Executive Function,54 completed by both parents and teachers.

Another potential concern from readers may be that the ability to test for placebo effects in the current study is limited due to the use of a waitlist control group rather than an active control group. To mitigate the risk of placebo or expectancy effects, we did not tell the participants that there was a control group. We simply told them they would be assigned to either a summer group or a fall group to start their training program. In addition, extant research on this possibility indicates that placebo effects are unlikely in cognitive training studies. Mahncke et al addressed the placebo effect using two control groups (passive and active) and found no placebo effect.55 That is, there was no difference in outcomes between the two types of controls. Similar results were also found in other cognitive training studies with two control groups.56,57 Further, two meta-analyses of cognitive training studies (n=35) revealed no difference in outcomes between types of control groups.58,59 Therefore, we conclude that expectancy or placebo effects were unlikely in the current study.

A final limitation to the current study is the lack of follow-up testing. We did complete follow-up testing with many of the participants in the larger study. However, because we used a waitlist control group, participants in both groups had been through the intervention at follow-up testing. Therefore, we could not compare the outcomes between groups. Further, three of the six participants with ADHD in the treatment group failed to show up for follow-up testing. However, a separate study assessing long-term outcomes from cognitive training would be an important contribution to the field.

Additional areas to explore in future research with the ThinkRx program include comparisons among youth who have failed to respond to medication, youth who have partially responded to medication, and youth who have no ADHD symptoms at all. A randomized controlled trial with a large sample comprising children at the portal of entry to clinical care is also indicated.

The results of this study provide early support for the use of the ThinkRx cognitive training program in remediating cognitive skills in children and adolescents with attention problems. The results are consistent with larger studies on the program and with the nonexperimental research data reported on >5,000 LearningRx clients with attention problems and ADHD.33 The findings of both cognitive and behavioral benefits are an encouraging and noteworthy contribution to the cognitive training literature and to clinical practice. Finally, the use of deliberate distractions in an intervention for attention problems separates this cognitive training program from the traditional paradigm of accommodating the environment for children with attention difficulties and ADHD – an approach worthy of further exploration.

Acknowledgment

This study was sponsored by research and development funds from LearningRx.

Disclosure

ALM and TMM are employed by the Gibson Institute of Cognitive Research, the nonprofit research arm of the intervention described in this article. However, they had no financial stake in the outcome of the study. The authors report no other conflicts of interest in this work.


References

1.

Rabiner DL, Godwin J, Dodge KA. Predicting academic achievement and attainment: the contribution of early academic skills, attention difficulties, and social competence. School Psychol Rev. 2016;45(2):250–267.

2.

American Psychiatric Association. Diagnostic and Statistical Manual of Mental Disorders. 5th ed. Arlington: American Psychiatric Association; 2013.

3.

Barkley RA. The Executive Functions: What They Are, How They Work, and Why They Evolved. New York: Guilford Press; 2012.

4.

Brown TE. Executive functions and attention deficit hyperactivity disorder: implications of two conflicting views. Int J Disabil Dev Educ. 2006;53(1):35–46.

5.

Martel M, Nikolas M, Nigg JT. Executive function in adolescents with ADHD. J Am Acad Child Adolesc Psychiatry. 2007;46(11):1437–1444.

6.

McQuade JD, Tomb M, Hoza B, Waschbusch DA, Hurt EA, Vaughn AJ. Cognitive deficits and positively biased self-perceptions in children with ADHD. J Abnorm Child Psychol. 2011;39(2):307–319.

7.

Martinussen R, Hayden J, Hogg-Johnson S, Tannock R. A meta-analysis of working memory impairments in children with attention deficit/hyperactivity disorder. J Am Acad Child Adolesc Psychiatry. 2005;44(4):377–384.

8.

Lewandowski LJ, Lovett BJ, Parolin R, Gordon M, Codding R. Extended time accommodations and the mathematics performance of students with and without ADHD. J Psychoeduc Assess. 2007;25(1):17–28.

9.

Barkley RA, Murphy KR, Fisher M. ADHD in Adults: What the Science Says. New York: Guilford Press; 2008.

10.

Harpin VA. The effect of ADHD on the life of an individual, their family, and community from preschool to adult life. Arch Dis Child. 2005;90(Suppl 1):i2–i7.

11.

Glatz T, Stattin H, Kerr M. Parents’ reactions to youths’ hyperactivity, impulsivity, and attention problems. J Abnorm Child Psychol. 2011;39(8):1125–1135.

12.

Rabiner DL, Malone PS; Conduct Problems Prevention Research Group. The impact of tutoring on early reading achievement for children with and without attention problems. J Abnorm Child Psychol. 2004;32(3):273–284.

13.

Subcommittee on Attention-Deficit/Hyperactivity Disorder; Steering Committee on Quality Improvement and Management, Wolraich M, et al. ADHD: clinical practice guideline for the diagnosis, evaluation, and treatment of attention deficit hyperactivity disorder in children and adolescents. Pediatrics. 2011;128(5):1007–1022.

14.

Evans SW, Owens JS, Bunford N. Evidence-based psychosocial treatments for children and adolescents with attention-deficit/hyperactivity disorder. J Clin Child Adolesc Psychol. 2014;43(4):527–551.

15.

Halperin JM, Healey DM. The influences of environmental enrichment, cognitive enhancement, and physical exercise on brain development: can we alter the developmental trajectory of ADHD? Neurosci Biobehav Rev. 2011;35(3):621–634.

16.

Brown RB, Gerbarg PL. Non-drug Treatments for ADHD. New York: W.W. Norton and Company; 2012.

17.

Egeland J, Aarlien AK, Saunes BK. Few effects of far transfer of working memory training in ADHD: a randomized controlled trial. PLoS One. 2013;8(10):e75660.

18.

Tamm L, Epstein JN, Peugh JL, Nakonezny PA, Hughes CW. Preliminary data suggesting the efficacy of attention training for school-aged children with ADHD. Dev Cogn Neurosci. 2013;4:16–28.

19.

Steiner NJ, Sheldrick RC, Gotthelf D, Perrin EC. Computer-based attention training in the schools for children with attention deficit/hyperactivity disorder: a preliminary trial. Clin Pediatr (Phila). 2011;50(7):615–622.

20.

van der Oord S, Ponsioen AJ, Geurts HM, Ten Brink EL, Prins PJ. A pilot study of the efficacy of a computerized executive functioning remediation training with game elements for children with ADHD in an outpatient setting: outcome on parent-and-teacher-rated executive functioning and ADHD behavior. J Atten Disord. 2014;18(8):699–712.

21.

Rabiner DL, Murray DW, Skinner AT, Malone PS. A randomized trial of two promising computer-based interventions for students with attention difficulties. J Abnorm Child Psychol. 2010;38(1):131–142.

22.

Beck SJ, Hanson CA, Puffenberger SS, Benninger KL, Benninger WB. A controlled trial of working memory training for children and adolescents with ADHD. J Clin Child Adolesc Psychol. 2010;39(6):825–836.

23.

Mawjee K, Woltering S, Tannock R. Working memory training in post-secondary students with ADHD: a randomized controlled study. PLoS One. 2015;10(9):e0137173.

24.

Cortese S, Ferrin M, Brandeis D, et al. Cognitive training for attention deficit hyperactivity disorder: a meta-analysis of clinical and neuropsychological outcomes from randomized controlled trials. J Am Acad Child Adolesc Psychiatry. 2015;54(3):164–174.

25.

Gibson K, Mitchell T, Tenpas D. ThinkRx: Cognitive Training Procedures Workbook. Colorado Springs: LearningRx; 2003.

26.

McGrew K. The Cattell-Horn-Carroll theory of cognitive abilities. In: Flanagan DP, Harrison PL, editors. Contemporary Intellectual Assessment: Theories, Tests, and Issues. New York: Guilford; 2005:151–179.

27.

Feuerstein R, Feuerstein RS, Falik LH. Beyond Smarter: Mediated Learning and the Brain’s Capacity for Change. New York: Teacher’s College Press; 2010.

28.

Kozulin A, Lebeer J, Madella-Noja A, et al. Cognitive modifiability of children with developmental disabilities: a multicenter study using Feuerstein’s Instrumental Enrichment–Basic program. Res Dev Disabil. 2010;31(2):551–559.

29.

Carpenter D, Ledbetter C, Moore AL. LearningRx cognitive training effects in children ages 8–14: a randomized controlled study. Appl Cogn Psychol. 2016;30(5):815–826.

30.

Gibson K, Carpenter D, Moore AL, Mitchell T. Training the brain to learn: beyond vision therapy. Vis Dev Rehab. 2015;1(2):119–128.

31.

Jedlicka E. LearningRx cognitive training for children and adolescents ages 5–18: effects on academic skills, behavior, and cognition. Front Educ. 2017;2(62).

32.

Hill OW, Serpell Z, Faison MO. The efficacy of the LearningRx cognitive training program: modality and transfer effects. J Exp Educ. 2016;84(3):600–620.

33.

Wainer H, Moore AL. LearningRx Client Outcomes and Research Results. Colorado Springs: Gibson Institute of Cognitive Research; 2016.

34.

Woodcock RW, McGrew KS, Mather N. Woodcock Johnson III Tests of Cognitive Abilities. Rolling Meadows: Riverside; 2007.

35.

McGrew KS, Schrank FA, Woodcock RW. Technical Manual, Woodcock-Johnson III Normative Update. Rolling Meadows: Riverside; 2007.

36.

McGrew KS. CHC theory and the human cognitive abilities project: standing on the shoulders of the giants of psychometric intelligence research. Intelligence. 2009;37(1):1–10.

37.

Zelazo PD, Anderson JE, Richler J, Wallner-Allen K, Beaumont JL, Weintraub S. NIH Toolbox Cognition Battery (CB): measuring executive function and attention. Monogr Soc Res Child Dev. 2013;78(4):16–33.

38.

Barron KE, Evans SW, Baranik LE, Serpell ZN, Buvinger E. Achievement goals of students with ADHD. Learn Disabil Q. 2006;29(3):137–158.

39.

Ardito RB, Rabellino D. Therapeutic alliance and outcome of psychotherapy: historical excursus, measurements, and prospects for research. Front Psychol. 2011;2:270.

40.

Van Voorhis CR, Morgan BL. Understanding power and rules of thumb for determining sample sizes. Tutor Quant Methods Psychol. 2007;3(2):43–50.

41.

Jacobson NS, Truax P. Clinical significance: a statistical approach to defining meaningful change in psychotherapy research. J Consult Clin Psychol. 1991;59(1):12–19.

42.

Percy WH, Kostere K, Kostere S. Generic qualitative research in psychology. Qual Rep. 2015;20(2):76–85.

43.

Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.

44.

Rovai AP, Baker JD, Ponton MK. Social Science Research Design and Statistics. 2nd ed. Chesapeake: Watertree Press; 2014.

45.

Delis DC, Kaplan E, Kramer JH. Delis-Kaplan Executive Function System. Bloomington, MN: Pearson Clinical; 2001.

46.

Atkins DC, Bedics JD, McGlinchey JB, Beauchaine TP. Assessing clinical significance: does it matter which method we use? J Consult Clin Psychol. 2005;73(5):982–989.

47.

Moore AL, Ledbetter C. Beyond attention: memory and processing speed deficits dominate cognitive profiles in ADHD across the lifespan. Poster presented at: American Psychological Association Annual Convention; August 3–6, 2017; Washington, DC.

48.

Ledbetter C, Faison MO, Patterson J. Correlation of cognitive training gains and resting state functional connectivity. Poster presented at: Society for Neuroscience; November 12–16, 2016; San Diego, CA.

49.

Moore AL, Ledbetter C, Carpenter D. MRI and neuropsychological outcomes following cognitive rehabilitation training in traumatic brain injury: a multiple case study. Poster presented at: Society for Neuroscience; November 11–15, 2017; Washington, DC.

50.

Melby-Lervag M, Redick TS, Hulme C. Working memory training does not improve performance on measures of intelligence or other measures of “far transfer”: evidence from a meta-analytic review. Perspect Psychol Sci. 2016;11(4):512–534.

51.

Visser SN, Danielson ML, Bitsko RH, Perou R, Blumberg SJ. Convergent validity of parent-reported ADHD diagnosis: a cross-study comparison. JAMA Pediatr. 2013;167(7):674–675.

52.

Warnell F, George B, McConachie H, Johnson M, Hardy R, Parr JR. Designing and recruiting to UK autism spectrum disorder research databases: do they include representative children with valid ASD diagnoses? BMJ Open. 2015;5(9):e008625.

53.

Bussing R, Fernandez M, Harwood M, et al. Parent and teacher SNAP-IV ratings of attention deficit hyperactivity disorder symptoms. Assessment. 2008;15(3):317–328.

54.

Gioia GA, Isquith PK, Guy SC, Kenworthy L. Behavior Rating Inventory of Executive Function. 2nd ed. Lutz, FL: PAR Inc.; 2005.

55.

Mahncke HW, Connor BB, Appelman J, et al. Memory enhancement in healthy older adults using a brain plasticity-based training program: a randomized controlled study. Proc Natl Acad Sci U S A. 2006;103(33):12523–12528.

56.

Burki CN, Ludwig C, Chicherio C, de Ribaupierre A. Individual differences in cognitive plasticity: an investigation of training curves in younger and older adults. Psychol Res. 2014;78(6):821–835.

57.

Dunning DL, Holmes J, Gathercole SE. Does working memory training lead to generalized improvements in children with low working memory? A randomized controlled trial. Dev Sci. 2013;16(6):915–925.

58.

Au J, Sheehan E, Tsai N, Duncan GJ, Buschkuehl M, Jaeggi SM. Improving fluid intelligence with training on working memory: a meta-analysis. Psychon Bull Rev. 2015;22(2):366–377.

59.

Peng P, Miller AC. Does attention training work? A selective meta-analysis to explore the effects of attention training and moderators. Learn Individ Diff. 2016;45:77–87.
