Cramming with retrieval practice

Yes, it beats rereading. Here is the catch every guide skips.

You already know the result. Karpicke and Roediger 2008 again, the same paper every studying-advice post cites: about 80% retention one week later for retrieval-practice students, about 34% for reread-only. The gap holds even from a single cramming session. So you should be making practice questions instead of rereading.

The part nobody addresses is who is supposed to write the questions. You have 30 lectures left, six hours, and a midterm at 9 a.m. Hand-authoring practice questions from a 90-slide deck takes one to two hours. So you reread, get the 34%, and tell yourself you studied. This page is about the missing step between the cognitive science and a 1 a.m. cram session.

Matthew Diakonov · 9 min read

Direct answer · verified 2026-05-02

Do retrieval practice questions beat rereading when cramming?

Yes. In the Karpicke and Roediger 2008 paradigm cited across the studying-strategy literature, retrieval-practice students retained roughly 80% of the material one week later vs roughly 34% for reread-only students. That gap shows up even from a single cramming session. The honest caveat: cramming with retrieval still loses to the same total time spread across weeks (the spacing effect), so retrieval-cramming is the right move when the exam is tomorrow, not when the exam is three weeks out. The primary-source guide is the retrievalpractice.org research summary by Agarwal and colleagues.

Why rereading feels like studying and isn't

On the second pass through a slide deck, the words feel familiar. Cognitive psychologists call that the illusion of fluency: your brain reads familiarity as mastery, and you walk out of the session feeling like you understand the material. The exam then asks you a question where the slide isn't in front of you, and the recognition you practiced doesn't transfer to retrieval. You knew it when you could see it. You don't know it when you have to pull it from memory.

Retrieval practice fixes this by doing the thing the exam will do: asking you for the fact with the slide hidden. The act of trying (even when you fail) physically strengthens the path back to the fact. The metacognitive payoff is bigger than the retention payoff: a missed retrieval question tells you exactly which slide you don't know, which is information rereading never gives you because rereading feels uniformly good across material you know and material you don't.

The asymmetry is what kills the rereader. Retrieval practice is unpleasant; rereading is comfortable. Cramming hours are when you have the least willpower to choose the unpleasant option. So the tool you use has to make retrieval the path of least resistance, not the harder option you have to summon discipline for.

The bottleneck no studying-strategy article addresses

The retrieval-practice argument has been written a thousand times. The Association for Psychological Science wrote it. Pocket Prep wrote it. The retrievalpractice.org PDF guide (Agarwal and colleagues) wrote it the most thoroughly. They all end with the same recommendation: instead of rereading, make practice questions.

And then they stop. Nobody writes the next paragraph, which is the arithmetic. A 90-slide lecture deck takes one to two hours to hand-author into 100 practice cards. Med students, nursing students, dental students, pharmacy students all carry 5 to 10 decks per week minimum. The crammer typically has 20 to 40 PDFs left, the night before the exam, after weeks of telling themselves they would start sooner.

The advice "make practice questions instead of rereading" is, for this reader, "spend 30 to 60 hours producing study materials before you start studying, in the 6 hours you have left." It is unactionable advice. The crammer rereads, gets the 34%, and the studying-strategy literature posts the same article again next year.

The interesting question is: what would it take to make retrieval practice the actually-easier option in the cramming-night situation? The bar is low. The questions have to exist before the drilling can start, and producing them has to fit in the gap between deciding to cram and falling asleep at the desk.

[60-second demo] A 90-slide cardiology lecture, four question formats per fact slide, about 200 cards in roughly 60 seconds of conversion (Studyly conversion pipeline, measured on a typical 90-slide PPTX). The cognitive-science answer to a cram night was always "make practice questions." This is what removing the writing step looks like.

What it takes to make retrieval the easier option

Three things have to be true before retrieval-practice cramming becomes the path of least resistance. The studying-strategy literature treats all three as the reader's problem. They are actually tooling problems.

Question authoring has to drop from hours to minutes. The crammer needs the questions to exist by the time the coffee is done brewing. Studyly's conversion pipeline produces around 200 questions from a 90-slide deck in about 60 seconds. For a 30-PDF dump that is roughly 30 minutes of unattended conversion if you upload serially, less if you upload in parallel. The hand-authoring equivalent is on the order of 30 to 60 hours.

The questions have to be good, not just present. A bad MCQ (one obviously-correct answer, three nonsensical distractors) tests recognition the same way rereading does, just with extra steps. The four-criterion rubric Studyly publishes on the quality page (factual correctness, stem clarity, distractor plausibility, question-type coverage) runs as a per-card gate before each card ships. The same rubric and methodology produced an 81.3 score on the held-out three-document eval; the closest non-Studyly competitor scored 78.0, the most popular alternative scored 57.8.

The session has to stay survivable past 1 a.m. Cramming sessions die from boredom and exhaustion, not from running out of material. The visible-progress mechanic (one tree per deck, decks chain into a river, a card-block grows the tree one stage) is the part that keeps the crammer at the desk for the last two hours. The cognitive-science articles never mention this because it isn't a cognitive-science variable. It is the largest predictor of whether the session finishes at all.

Rereading vs cramming with retrieval questions, side by side

The honest comparison. Rereading wins exactly one row: cognitive cost when you are exhausted. That is also why it doesn't work.

| Feature | Rereading the slides | Cramming with retrieval questions |
|---|---|---|
| Retention 1 week later (Karpicke & Roediger 2008 paradigm) | About 34% | About 80% |
| Practices the skill the exam tests | Recognition (familiar feeling on the page) | Retrieval (pull the fact from cold memory) |
| Surfaces what you don't know | Hidden under the illusion of fluency | Every miss is a labeled gap with a slide-number pointer |
| Time per 90-slide lecture | 30 to 60 minutes of skim-rereading | 60 seconds to convert + the time you spend drilling |
| Holds up on take #5 of the same material | Yes (you are still rereading, learning nothing new) | Yes (auto-rephrased stems, reshuffled distractors) |
| Cognitive cost when you are exhausted | Low (which is why it feels good and does nothing) | High (which is what makes it work) |

The numbers behind the argument

Two pairs. The first pair is the well-replicated retention gap from the cognitive psychology literature. The second pair is the question-quality eval Studyly publishes; that one matters because retrieval cramming with bad questions degrades to recognition practice, which degrades to rereading.

~80% · Retrieval-practice retention (1 week later)
~34% · Reread-only retention (1 week later)
81.3 · Studyly question-quality score (eval)
57.8 · Turbolearn on the same eval

Retention numbers: Karpicke and Roediger 2008 paradigm, summarized across the studying-strategy literature. Question-quality eval methodology and rubric definitions are on /quality.

A one-night cramming protocol that actually uses retrieval

The five steps below assume you have one night, an unread pile, and the unfortunate combination of caffeine and time pressure. Each step is a specific behavior, not a vibe.

If the exam is in 6 hours

1. Dump everything in

Every PDF, every slide deck, every YouTube lecture link you have left. The conversion runs in parallel; a 90-slide deck takes about 60 seconds, four question formats per fact slide. Don't babysit it. Walk away, make coffee.

2. Baseline the first deck on MCQ

Run the deck end-to-end on multiple-choice. You're not trying to ace it; you're labeling which facts you don't know yet. The miss list is the actual study plan for the rest of the night.

3. Re-drill the misses on free-response or case-style

Switch the missed cards to free-response (no options to recognize against) or case-style (the fact embedded in a clinical scenario). This is where retrieval beats recognition. The same fact, drilled with the options removed, forces a cold pull.

4. On a wrong answer, read only the cited slide

Studyly cites the source slide number on every miss. Open slide 23, read the 4 bullets on it, close the slide. Do not reread the deck end-to-end between attempts. That is the rereading failure mode you came here to escape.

5. Move on to the next deck once the misses thin out

When the queue is mostly cards you got right twice in a row, you've extracted what cramming can extract from this deck. Moving on costs less than re-drilling the same correct cards another five times. The tree on the deck advances; one tree per deck, all visible from the dashboard.

Anchor fact · why the writing-step removal matters

30 PDFs in. ~6,000 retrieval cards out. About 30 minutes.

The Cramming Procrastinator persona Studyly built around isn't a marketing fiction; it's the modal use case. Around 200 cards per 90-slide deck across four formats (MCQ, free-response, case-style, image-occlusion on labeled diagrams). For a 30-PDF pile, that is on the order of 6,000 retrieval-format cards produced in about half an hour of unattended conversion time. A human authoring the same volume by hand at one card per minute would need 100 hours.
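The arithmetic above is easy to sanity-check. A back-of-envelope sketch using the per-deck figures quoted on this page (all of them estimates from the article, not measurements):

```python
# Back-of-envelope check of the cram-night arithmetic above.
# Inputs are the estimates quoted in this article, not measurements.
decks = 30                  # PDFs left the night before the exam
cards_per_deck = 200        # ~200 cards from a 90-slide deck
convert_secs_per_deck = 60  # ~60 s of unattended conversion per deck
hand_mins_per_card = 1      # hand-authoring rate: one card per minute

total_cards = decks * cards_per_deck
serial_conversion_min = decks * convert_secs_per_deck / 60
hand_authoring_hours = total_cards * hand_mins_per_card / 60

print(total_cards)            # 6000 cards out
print(serial_conversion_min)  # 30.0 minutes, uploading serially
print(hand_authoring_hours)   # 100.0 hours by hand
```

The ~200x gap between 30 minutes and 100 hours is the whole argument: it is what moves retrieval practice from "correct but unactionable" to the path of least resistance.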

The reason this number matters is that it changes which option is the path of least resistance. As long as making retrieval questions costs hours, the exhausted crammer reaches for the highlighter. Once the cost drops below the cost of opening a second tab, retrieval becomes the easier choice and the cognitive-science prescription becomes actionable for the population that needed it most.

Skip these

Activities that feel like cramming and are not.

Familiar but useless

  • Re-highlighting the slides in a different color (you've already done this)
  • Watching another lecture video at 2x to 'review' (passive recognition again)
  • Recopying notes onto fresh paper (transcription is not retrieval)
  • Reading the textbook chapter the lecture summarizes (you don't have time)
  • Making a study guide document (writing about the material is not retrieving the material)

Do these

The actual high-yield moves for the hours you have left.

Worth your remaining hours

  • Generate retrieval-format cards from every PDF you have left
  • Drill cold (no peeking at notes between misses)
  • Read only the cited source slide on a wrong answer
  • Switch missed MCQs to free-response or case-style after the first pass
  • Sleep at least 4 hours; consolidation needs sleep more than it needs another hour of drilling

When this advice is wrong

A few honest cases where retrieval-practice cramming is not the right answer.

  • You have three weeks, not one night. Spread the same retrieval drills across the three weeks. The spacing effect roughly doubles long-term retention vs the same total time massed in a cram session. If you cram a deck tonight that you'll be tested on three weeks from now, you'll have lost most of it by then.
  • Your exam is computational. Step-by-step problem solving (calculus, physics, dose calculations, organic chemistry mechanisms) wants a problem-solver, not a concept-recall tool. Retrieval-practice questions test whether you can name an enzyme; they don't teach you to balance a redox half-reaction. Use a worked-solution tool for those.
  • You don't have the source material. Retrieval practice on something you have never seen the source of degrades to guessing. If your professor gave no slides and you took no notes, your first hour goes into producing source material to convert (a textbook chapter, a recorded lecture, a classmate's deck), not into drilling.
  • You are at the falling-asleep-at-the-desk threshold. Memory consolidation requires sleep. Below about four hours of sleep, additional drilling stops adding retention and starts costing it. The studying-strategy literature is unanimous on this and the studying-advice blogs hate to print it.


Drop tonight's pile in

About 60 seconds per deck. Then you start drilling.

Free tier on app.jungleai.com, no credit card. Email gate sends a one-click access link.

Common questions about cramming with retrieval practice

Does retrieval practice actually beat rereading when you are cramming?

Yes, by a wide margin in retention, but with one caveat. Karpicke and Roediger's 2008 study (the one most studying-advice articles cite) found that students who used retrieval practice retained about 80% of the material one week later, vs about 34% for students who only reread. That gap shows up even with a single cramming session. The caveat is that cramming with retrieval still underperforms the same total study time spread out over weeks (the spacing effect). If your exam is tomorrow, retrieval-practice cramming is the highest-yield thing you can do; if your exam is in three weeks, do the same thing but spread it across the three weeks.

Why doesn't rereading work, even though it feels productive?

Rereading produces what cognitive psychologists call the illusion of fluency. The text feels familiar on the second pass, your eyes glide over it, and your brain reads familiarity as mastery. The exam doesn't ask you to recognize the material on a page; it asks you to retrieve a fact you cannot currently see. The skill of retrieving is different from the skill of recognizing, and you can only practice it by trying to retrieve. Highlighting and rereading optimize for the wrong skill.

What's the bottleneck nobody mentions about cramming with retrieval questions?

The questions have to come from somewhere. Every cognitive-science blog on this topic ends with 'make your own practice questions' as if that's a free action. It is not. Hand-authoring 100 cards from a 90-slide lecture deck is a one-to-two-hour task per deck. If you have 30 lectures left to cover, that is 30 to 60 hours of writing questions before you have started studying. The crammer doesn't have those hours; the crammer has six. So they reread instead, get the 34% retention, and tell themselves they studied.

Is ChatGPT a substitute for a real practice-question generator when cramming?

It will produce questions, and on a single PDF that's better than rereading. The problems show up at scale. ChatGPT does not enforce a quality rubric on what it emits, so distractors are often implausible (one obviously wrong, three obviously right) and stems often paraphrase the answer. It does not track which questions you got right, so the deck doesn't adapt. And it returns the same wording on the second take, which means by your third pass you are pattern-matching the question, not retrieving the fact. Fine for one PDF on a calm afternoon; not the right tool for 30 PDFs at 1 a.m.

How is Studyly's cramming flow different from making questions in ChatGPT?

Three concrete differences. First, the four-criterion rubric (factual correctness, stem clarity, distractor plausibility, question-type coverage) runs on every generated card before it ships and on every revisit-time rephrase. Studyly scored 81.3 on a held-out three-document eval where Turbolearn scored 57.8 on the same documents and rubric. Second, the stem is auto-rephrased and distractors reshuffled on revisit, so taking the same deck five times in one night is still five different tests rather than one wording memorized five ways. Third, every card cites the source slide number on a wrong answer; you go back to slide 23 to fix the gap, not back to the entire 90-slide deck.

How long does Studyly take on a 30-PDF dump?

Roughly 60 seconds per 90-slide deck for 200 questions, four formats per slide (MCQ, free-response, case-style, image-occlusion on diagrams). For 30 decks that is on the order of 30 minutes wall-clock if you upload them serially, less if you upload in parallel. Then you start drilling. Compare to ~30 to 60 hours of hand-authoring the same volume.

Why a tree per deck if I'm just trying to cram for tomorrow?

Because cramming sessions die from boredom, not from a question bank running dry. The tree-per-deck mechanic is the visual loop that makes a 1 a.m. session bearable: every block of cards moves the tree forward a stage, every deck completed adds a tree to the river, and you can see the cram session physically progressing on the screen. There is also a plain pedagogy reason: visible progress is one of the most reliable interventions against quitting a study session early. The product-mechanics reason is the same one mobile games use.

Does this work if my source material is a textbook PDF, not lecture slides?

Yes. The intake stage normalizes PPTX, KEY, PDF (born-digital and scanned), and YouTube lecture transcripts to the same per-section internal representation. The four generators don't care which file type the section came from. Where it falls down is computational problem sets (worked equations, dose calculations) where you need a step-by-step solver, not a concept-recall question. Studyly handles concept questions; it does not show work on integrals.

What about cramming for USMLE, NCLEX, or boards specifically?

The four-format generation works equally well on board-style materials, with one practical note: case-style stems are the highest-leverage format for board-style exams because they mirror the test's actual question shape. The case-style generator runs on every fact slide regardless of source, so a 90-slide cardiology deck produces around 50 case-style stems alongside the MCQs and free-response cards. You can filter the queue to case-style only when you want to drill clinical reasoning instead of recognition.

What's the right way to cram with retrieval questions if I have one night?

Five-step version. (1) Dump every PDF and slide deck you have left into Studyly and let the conversion run; while it runs, you don't have to babysit it. (2) Drill the first deck end-to-end on MCQ format, get a baseline of what you don't know. (3) Switch to case-style or free-response on the cards you missed; this forces cold recall instead of recognition. (4) When you miss a card, click into the source-slide citation and read just that slide, then close it. Don't reread the whole deck. (5) Move to the next deck. The retrieval-with-feedback loop is the active ingredient; rereading entire decks between attempts re-introduces the rereading failure mode.