Curating for Learning: An Easy(ish) Step-by-Step Guide

By Julia Huprich
Director of Curation Strategy

Like many other L&D professionals, you may be wondering how you can use the wealth of freely available, high-quality, informative content online to your advantage. I believe you can follow an easy curation process to accomplish three different objectives: enhancing your existing e-learning, complementing your in-person classes, and, my favorite, developing brand-new courses built from content curated from around the web. If you’re curious about how you can replace your old, outdated training courses with fresh, curated content, then continue on, dear reader.

The ADCIE Model: A Step-by-Step Guide

When I began curating for learning, there were several general guides for curation, including those developed by Robin Good, Harold Jarche, Beth Kanter, and others; however, I found none that specifically addressed how to leverage this content for instructional purposes. How did I build a course using curated content? How did I organize the content? What did I do, exactly? Personally, I was looking for a step-by-step guide to building a course using exclusively curated content, which I could not find. (And, as a former librarian, I’m fairly decent at finding information.) So, I created my own model, which {drumroll please} I present to you now. Introducing… the ADCIE Model.

{Insert record-scratching sound here.} Wait a minute, does that say ADCIE Model? As in, the ADDIE Model? It does. You’re probably familiar with ADDIE. It’s a good instructional design model, and there’s no reason to reinvent the wheel here. I took this existing model and made it my own, so some of the steps will seem somewhat familiar. Here’s how it goes.


Analysis

The Analysis phase of the model is essentially the same as the one you’re familiar with now. During this step, you’ll answer a few questions:

  • Who’s the audience? What do they know, and not know?
  • What are the instructional goals for the course? What should they know, or be able to do, after the completion of the course?
  • What kind of content would resonate with this audience? What kind of content should be avoided?
  • When working with a client, I also ask: is there proprietary content that needs to be included here? How will that be worked in? What content would the SME (subject matter expert) like to have included?


Design

Now that we’ve gathered a bit of data during the Analysis phase, we’ll use this knowledge to design the learning objectives for our course. What, exactly, does the content need to cover? What are the behavioral objectives? What principles should be represented in the content, and what is the basic, foundational information that needs to be included to support those principles? Breaking down each knowledge objective is imperative in this step, as it’s key to success during the Curation phase.
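If a concrete artifact helps, the output of this breakdown can be captured in something as simple as a small data structure. This is just an illustration, not something the model prescribes; the objective and sub-topics here are made-up examples:

```python
# A sketch of one decomposed knowledge objective. Each behavioral objective
# gets a list of principles and foundational concepts; every one of those
# becomes a seed for search terms in the Curation phase.
# (All values below are hypothetical examples.)
objective = {
    "objective": "Deliver constructive feedback to a direct report",
    "principles": [
        "feedback should be specific",
        "feedback should be timely",
    ],
    "foundational_knowledge": [
        "what constructive feedback is",
        "common feedback models",
        "active listening basics",
    ],
}

# The flattened list of seeds you'd carry into the Curation phase.
search_seeds = objective["foundational_knowledge"] + objective["principles"]
```

However you record it (spreadsheet, outline, sticky notes), the point is the same: each objective should leave this phase already broken into searchable pieces.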


Curation

This, my friends, is where the magic happens. The goal of this step is to identify high-quality resources that, when organized together, create a cohesive instructional narrative. How does one do that? I created an easy, albeit long, process:

1) Identify criteria for desired content:

  • Content scope: Who will be engaging with the content? What level should it be at, and what concepts is the content expected to cover?
  • Format: What formats are expected (article, video, podcast, etc.)?

2) Identify search terms for your initial search, drawing on performance measures or SME-provided learning objectives.

3) Conduct initial search.

4) Using the results, identify content domain/scope of search. What is the general content area? Is this accurate, or does there need to be a modification to the original search terms?

5) Conduct a cursory evaluation of the results from the initial search. Some evaluation criteria to consider:

  • Authority: Are there any specific authors or sources that appear more often than others? Are they reputable?
  • Validity: Is the content in the initial results related to the performance measures, tasks, subtasks, or evidence-of-mastery information from the SME? Verify the validity of the search results and evaluate your search strategy. Are other keywords needed?
  • Currency: Is content current? If not, what are the current terms used?

6) Aggregate items that meet your initial content criteria; there are lots of tools to do this.

7) Identify search terms for additional searches.

  • Are there terms related to the initial search term?
  • Are there tools or processes specific to the learning objective that should be included?
  • Are there foundational aspects of the initial search term that can inform the learner or basic concepts that should be explained further?

8) Conduct additional searches, using terms identified in step #7, as well as terms included in the task, subtasks, evidence of mastery, and SME interview.

9) Repeat steps 4–8. Keep going until you’ve got a collection of resources that you feel address the learning objectives. But don’t stop there…

10) Evaluate all aggregated content using selection criteria. Mine are modified from Publication 98 of the Council on Library and Information Resources and the Digital Library Federation, Building Sustainable Collections of Free Third-Party Web Resources. I use criteria that include the following, but feel free to develop your own:

  • Information coverage: subject matter, types of resources, types of sources, level of difficulty
  • Validity: research, data sources, references, bias, motive
  • Authority/reputation: information source, author’s credentials, organization/publication, filter/referees, referral source, validated authorship
  • Accuracy
  • Comprehensiveness
  • Uniqueness
  • Pedagogical quality
  • Objectivity
  • Currency
  • Appearance

11) Remove any content that does not meet selection criteria; repeat steps 4–8 as needed.

12) Organize assets in a logical fashion. For the courses I build for clients and the paths I curate at Catcat, I break content into sections and provide descriptions for the learner about each section. It’s important to provide learners with context about what they’re reading/watching/learning and why. When you’re organizing the content, determine:

  • Are there related content domains included here that should be excluded?
  • Are there related topics that should be presented in a specific order?
  • Is there foundational knowledge that needs to be understood prior to engaging with a specific piece of content?
  • Is there a cohesive instructional narrative? Does the content, when sequenced together, tell a story for the learner?

13) Review all content to ensure that:

  • Learning objectives have been met
  • A variety of formats, sources, and authors have been presented (no back-to-back long podcasts, for example)
  • Content is not repetitive

At this point, you may need more content. That’s okay. Go back to step 4. It’s an iterative process; you’ll repeat these steps as often as you need to get a course that makes sense for the learner.

14) Once you’ve aggregated, evaluated, and organized all of your content, review your course with the SME. Is the content delivered appropriate for the learning need?

15) Breathe. You’re done. (Well, almost.)
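If you like to think in code, the evaluate-filter-organize core of this process (roughly steps 10 through 12) can be sketched as a simple routine. This is just an illustration, not a tool the model prescribes; the Resource fields, criteria names, and example data are all hypothetical:

```python
# A minimal sketch of steps 10-12: score each aggregated resource against
# your selection criteria, drop anything that fails, and group what's left
# into ordered sections. All names and data here are illustrative.
from dataclasses import dataclass, field

# A subset of the selection criteria from step 10, as pass/fail checks.
CRITERIA = ["validity", "authority", "accuracy", "currency", "pedagogical_quality"]

@dataclass
class Resource:
    title: str
    url: str
    section: str
    checks: dict = field(default_factory=dict)  # criterion -> passed?

    def meets_criteria(self) -> bool:
        # Step 11: a resource survives only if every criterion passes.
        return all(self.checks.get(c, False) for c in CRITERIA)

def organize(resources, section_order):
    """Step 12: keep passing resources, grouped in teaching order."""
    kept = [r for r in resources if r.meets_criteria()]
    return {s: [r for r in kept if r.section == s] for s in section_order}

# Hypothetical aggregated pool: one solid resource, one that fails currency.
pool = [
    Resource("Feedback basics", "https://example.com/a", "Foundations",
             {c: True for c in CRITERIA}),
    Resource("Outdated listicle", "https://example.com/b", "Foundations",
             {**{c: True for c in CRITERIA}, "currency": False}),
]
course = organize(pool, ["Foundations", "Practice"])
```

In practice you’ll do this evaluation in whatever tracker you already use; the point is that every resource gets checked against every criterion before it earns a place in a section.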


Implementation

At this point, you should have a complete, curated course. So go ahead, launch your course! Share it with the world! Be proud! But don’t forget the final step…


Evaluation

You can implement a number of evaluation strategies here. Maybe you want to test users on their knowledge (a Kirkpatrick Level 2 evaluation, for those of you following along at home). Or maybe you want to send users a survey three months after their completion of your course to determine how it changed their behavior (Level 3). No matter how high you go on the Kirkpatrick evaluation scale, you’ll want to ensure you at least allow users to give you Level 1 information: was the content you curated favorable? Engaging? Relevant? You’ll want to revisit previous steps if the results aren’t in your favor. You’ll also want to make sure that learners have a mechanism for letting you know if the links you’ve curated are broken, have moved, or are no longer relevant. Which brings me to a bonus step:

Updating Your Course

Information changes all the time. You may want to set up a content aggregator, like Anders Pink, to stay on top of all of the new content that’s being published on a specific topic. No matter what tool you use, you’ll want to review your course and replace old, outdated content with fresh, new stuff, and you’ll want to do this review on a regular basis. Delivering old content is no fun.
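For those who want a lightweight nudge toward that regular review, here’s a small sketch that flags curated items whose last review date has passed a chosen threshold. The 90-day window, item names, and dates are illustrative assumptions, not part of the model:

```python
# A hedged sketch of a periodic currency check: flag any curated item whose
# last review is older than the review window, so it can be re-evaluated or
# replaced. The threshold and example data below are assumptions.
from datetime import date, timedelta

REVIEW_EVERY = timedelta(days=90)  # illustrative review window

def stale_items(items, today=None):
    """Return titles of (title, last_review_date) pairs past the window."""
    today = today or date.today()
    return [title for title, last_review in items
            if today - last_review > REVIEW_EVERY]

# Hypothetical curated collection with last-review dates.
curated = [
    ("Feedback basics", date(2024, 1, 5)),
    ("New industry report", date(2024, 5, 1)),
]
overdue = stale_items(curated, today=date(2024, 5, 10))
```

You could run something like this on a schedule, or just keep the review dates in a spreadsheet and sort by age; either way, stale content gets surfaced instead of forgotten.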

The End

So, there you go. That’s my model for developing a course, from scratch, using curated content. Try it out and let me know what you think. Do you have a different method? Is there something else that should be considered here? As an information professional, I’m always open to learning something new — so send me your ideas, constructive criticism, and helpful hints.