Telling Ain’t Training – Pt 7: Workplace Reference Guide?

Final Verdict

In short, Telling Ain’t Training is a valuable reference guide for anyone working in a training or informal education setting. If nothing else, the authors do a fantastic job explaining why SMEs don’t always make the best teachers and providing guidance on how to get them where they need to be. Additionally, the chapters on technology offer critical and salient advice about what to keep in mind when deciding whether your training will benefit from technology, most notably the potential benefits and the pitfalls you might run into. You can pick up a copy on Amazon here.

If you’re looking for activities that you can incorporate into training or adult learning sessions, I highly recommend Marcia L. Tate’s Sit and Get Won’t Grow Dendrites: 20 Professional Learning Strategies That Engage the Adult Brain (you can buy it here on Amazon).

And in case you missed it, check out the other six posts in this series:

Telling Ain’t Training – Pt 6: Technology Integration

Overview

In this post, I’ll cover the potential benefits of integrating technology into training. There are several considerations, most importantly the impact technology can have on the efficacy of the training. Stolovitch and Keeps summarize these factors, saying:

When it comes to training efficiency, the measure is fast and cheap. When it comes to training effectiveness, the measure is how well the learning goal is achieved.

Telling Ain’t Training, pg 181

Technology can help to meet these two metrics, as long as you understand that using technology to deliver content does not replace solid training design. The use of technology in training “can enable efficiency” if properly implemented. It can also turn gimmicky if its use isn’t well thought out or well executed.

Working at a tech education company and matrix-managing dispersed teams, I rely on technology constantly. When I started, I was onboarded to dozens of systems with no real explanation of why, or, really, of their uses. Some stuck because of their prevalence (Slack, for example) and/or their functionality (Google Docs). A few systems have found their way into my personal life, most notably my Instructional Resources Trello board. I’ve continued to explore our current technologies in order to leverage and expand our use of existing resources, including leading remote workshops using shared Google Slides decks and Zoom video software. I’ve even tried my hand at using a free LMS, Latitude Learning, to start hosting content.

As you can see, the list of technologies you can incorporate into training can get really long, really fast, and we haven’t even included the old standbys like Adobe Acrobat, the Microsoft Suite, Google Hangouts, Survey Monkey, and services like Moodle. With all of these ‘productivity tools’ floating around, it’s helpful to have a framework around what they can do to assist and then start narrowing down your specific use case.

What can you get out of integrating technology?

Chapter 10 of Telling Ain’t Training focuses on the use of technology in trainings, why you might consider using it, and some of the caveats you must face. There’s a great chart on pages 184-186 (which I’ve summarized below) that outlines the potential benefits of technology in trainings. I also encourage you to read that chapter to find out why promises from outside vendors around increased productivity and reduced costs may be too good to be true.

| Potential benefit | What it means |
| --- | --- |
| Accessibility | Anyone can access it from anywhere; helps reach remote teams and provides the opportunity to train people requiring accessibility accommodations |
| Instantaneous response and feedback | Instructors and participants can contact each other and receive near-instantaneous responses; allows for automatic responses or feedback based on preset criteria |
| Instantaneous testing and feedback | Testing can be created and hosted within certain platforms – especially multiple-choice questions, or trainings in which the program is both synchronous and asynchronous |
| Consistency of message | Templates, training and a single delivery platform can result in a more consistent message that can be monitored and maintained by a relatively small team |
| Rapidity of delivery | Reduces the need to coordinate in-person trainings; eliminates the need to schedule spaces and allows people to join when needed |
| Simultaneity of training delivery | Provides a platform to deliver one training to a large number of participants |
| Ease of update | Since all resources live within one system or platform, versioning issues often seen with static documents are reduced; updates can be pushed at one time so everyone gets them at the same time |
| Reusability | Trainings can be delivered over and over again without a reset period; content can be repurposed for other trainings |
| Flexibility of use | Use all or some pieces of a platform, use it for all or part of the training, use it for different types of training, and host modules, pathways, etc. for different types of content |
| Interactivity | Include audio, video, slides, Prezis, responsive tests and websites |
| Adaptability | Depending on the platform, content can be changed (scaled, updated, amended, appended) to fit into other trainings/programs, or provide dynamic content that responds to learners’ needs |
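The ‘instantaneous testing and feedback’ row is the easiest to picture in code. Here’s a minimal sketch, assuming a platform that stores an answer key with preset feedback messages – the question, options and messages below are invented for illustration, not taken from any real system:

```python
# Illustrative sketch: a multiple-choice item graded automatically against
# preset criteria, returning feedback the moment a response arrives.
QUESTION = {
    "prompt": "What is the measure of training effectiveness?",
    "options": {"a": "Fast and cheap", "b": "How well the learning goal is achieved"},
    "answer": "b",
    "feedback": {
        "correct": "Right - effectiveness is about achieving the learning goal.",
        "incorrect": "Not quite - revisit the efficiency vs. effectiveness distinction.",
    },
}

def grade(response: str) -> tuple[bool, str]:
    """Return (is_correct, preset feedback) with no instructor in the loop."""
    is_correct = response == QUESTION["answer"]
    return is_correct, QUESTION["feedback"]["correct" if is_correct else "incorrect"]
```

The point is not the three lines of logic but the design: because the criteria are preset, every participant gets feedback instantly and consistently, which is exactly the benefit the chart describes.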

The absolute most important thing to remember is that these benefits are conditional. They aren’t guaranteed and are heavily reliant on your current resources, your company’s infrastructure, setup costs – including training of internal users and onboarding – and time constraints, among a host of other factors. For these reasons, content rather than delivery method should be the deciding factor in whether a piece of technology is incorporated into a program.

Telling Ain’t Training – Pt 5: Training in a Collaborative Workplace

I’m going to tell you a not-so-secret: training adults is a game of social circles and politics. At one of my employers, training required a lot of buy-in from different groups, and participants generally wanted to feel like they were actively contributing to the event rather than being on the receiving end of a final product. The dynamic can be challenging – how can you have a training if the people being trained believe that they know all there is to know?

There are a couple of things at play here. In today’s business environment, it’s all about your title and your tenure. If you’re not in a management position, it can be hard to get people to follow your lead. I’ll save my leadership lessons for later but for the purpose of this post, we’ll focus on the idea of collaborating with participants to deliver a successful training.

Chapter 8 of Stolovitch and Keeps’s Telling Ain’t Training presents 25 scenarios you can use to add practical application to your trainings. I was planning an upcoming workshop around managing student concerns on campus and was excited to try some of them out. I flipped through each of them, eager to try something new. As I skimmed the exercises, I realized that none of them would work for me.

Why?

Because our team, dispersed across 15 cities and 3 continents, knows what they’re doing, and they don’t want someone trying to get them to learn something by rote. What this book presents is, in its truest form, training. Reading through the scenarios, I realized that what I wanted was a workshop – an event with true learner participation and a tangible end result.

I ended up with a format that was predominantly learner-led, with me giving confirming and/or corrective feedback and taking notes when someone brought up a suggestion that aligned with best practices. In addition to having participants share out and complete two practical application exercises, I also asked them to ‘help’ me come up with a guide that others can use to apply the standards set during this workshop to any situation.

Participant feedback was overwhelmingly positive, and I felt confident that they would be able to immediately implement their learning in their day-to-day responsibilities. Furthermore, the deck is available for reference, and a recorded version will be made so that remote campuses can offer the workshop asynchronously.

Utilizing Confirming and Corrective Feedback

A Familiar Scenario

Imagine you spent an entire weekend writing a paper for your Instructional Design course. It’s a lot of work and you’re unfamiliar with the content. You dedicate a few hours to reviewing the syllabus, assignment description and resources and you feel pretty confident in your final result.

When you get the grades back, you’re shocked to see a C+ next to your name. Under the feedback section, you get the following comment from your instructor:

“You’re just not getting it. Reread the diagram on page 2 and then resubmit.”

As the learner, take a moment to jot down your internal reactions and external actions. I’ve shared mine in this chart:

| External actions | Internal reactions |
| --- | --- |
| Reread the syllabus | Confusion |
| Revisit the diagram | Frustration |
| Contact the instructor for clarification | Demotivation |
| Email classmates to ask them for help | Resentment |

Looking at the internal reactions, we can see that things escalated quickly, didn’t they? I’m sure the instructor didn’t mean to imply that I, the learner, hadn’t done my due diligence in reading all relevant materials, but that’s what it feels like. I might reach out to other participants to discuss my confusion, only to find that they felt the same way. A picture begins to emerge – one of miscommunication that, when repeated, can quickly snowball into a negative learning experience.

This example is applicable to any educational setting and is indicative of a few areas of contention that I have experienced as both a learner and an educator. Let’s break down the example statement to find out why it fails to be useful.

The first half of the comment (“You’re just not getting it.”) implies that there is a flaw with the learner that prohibits them from grasping the content. It’s a variation of the old “try harder,” as if effort alone is all it takes to learn. Additionally, as described in a previous post, it can feel like a personal attack.

What about the second half (“Reread the diagram on page 2 and then resubmit.”)? What’s not to love? It tells the learner where to look, which some might categorize as a helpful hint. The problem is that the instructor doesn’t acknowledge their responsibility to expand on or further clarify the instructions. If the learner stared at the diagram for thirty additional minutes, would that somehow improve her comprehension? Should the instructor provide more context or other support to ensure the learner understands the content and remains motivated throughout the course or session?
The answer is yes – that is exactly the hallmark of a good educator, and if you’ve prepped appropriately for the topic you’re teaching, it doesn’t take much to make adjustments.

Corrective and Confirming Feedback

Using the same scenario as above, imagine if you received this feedback instead:

“Not quite, but you’re on the right track. You’ve done a good job of explaining x, but y is missing. I recommend reading resources 1 and 2 again and using z to frame your answer.”

It takes a few more words, sure, but it accomplishes several things:
  • Sets a positive tone
  • Calls out what the learner did right (because no one likes to be wrong all the time!)
  • Calls out areas of improvement (after the praise)
  • Suggests concrete ways to improve
  • Provides additional resources and/or context
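To make the structure concrete, those five elements can be assembled almost mechanically. Here’s a hypothetical sketch – the function, parameter names and message templates are mine, not from the book:

```python
def build_feedback(did_well: str, gap: str, next_step: str, resources: list[str]) -> str:
    """Assemble a confirming + corrective comment in the order above:
    positive tone, praise first, then the gap, a concrete step, and resources."""
    return (
        "Not quite, but you're on the right track. "
        f"You've done a good job of {did_well}, but {gap} is missing. "
        f"I recommend {next_step}, and reviewing: {', '.join(resources)}."
    )
```

Even if you never automate feedback, writing it out as a template like this is a useful exercise: it forces every comment you give to include the confirming half before the corrective half.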

Using a combination of corrective and confirming feedback empowers students to explore topics independently, while looking to instructors/facilitators/trainers for guidance and support. This doesn’t mean that you, the educator, need to hand-hold, coddle or give all of the answers away. Instead, it shows that you respect your learners and their ability to learn in the ways that are best for them, and that you support their educational journey.

Think of the last time you learned a new and complex topic. If someone had offered guiding tips and suggestions that helped you relate the content to something you already knew, or frame it within the context of your current life, wouldn’t that have made the experience not only more enjoyable but more effective?

What do you think? Have you used corrective and/or confirming feedback? What have been the results? As a learner, what type of feedback are you used to receiving and how does it influence your learning experience?

Telling Ain’t Training – Pt 3: Training the Right Way

Overview

The first two posts in this series talk about what we generally encounter as trainers – what we might define as failures in ourselves or our learners. I also cover a few techniques you can quickly implement for existing trainings or for those instances when you need to supplement content. This chapter and post focus on building the right foundation to avoid those failures altogether.

In essence, there is a set of guidelines that allows us to build effective trainings independent of the learning styles of participants. Let’s take a look at what they are.

Six Guidelines for Creating Successful Trainings

The Why: We’ve talked about this quite a few times, but it bears repeating. Learners need to know why they’re learning the content. If they place high value on the training and content, they are more likely to engage and retain information.

The What: Do you know what you’re teaching? Can you articulate this ‘what’ using specific learning objectives? They should be listed in the course description, and perhaps on the syllabus or in the classroom. This sets the end goal for you and your learners.

The Structure: “Humans seek order” (pg 75) is something I have come to realize working in operations, and even more so as an educator. Order allows participants to quickly grasp patterns as well as connect previously held knowledge to newly learned information.

The Response: How do you plan to add interactions to your sessions? Response refers to the way in which learners respond to the content you’re presenting. According to research, as well as Stolovitch and Keeps, this can take the form of “answering a question, filling in a blank, labeling something, solving a problem, making a decision, or even discussing and arguing” (pg 76).

The Feedback: Feedback is information that learners receive about how on or off target they are. It comes from the facilitator or instructor, or from other environmental components (e.g. a chemical reaction during a science experiment, or a red ‘x’ or green check mark during an online quiz). Research indicates that feedback should be immediately relevant to the task; personal criticism, perceived or otherwise, decreases performance. Additionally, feedback should be timely, frequent and specific.

The Reward: What does the learner get for successfully completing a task? Rewards work the same way they did in childhood; they motivate learners to continue a desired behavior. The actual reward will vary, but as long as it is perceived as valuable to the learner, it will be a successful motivational tool.

 

Telling Ain’t Training – Pt 2: Blame Isn’t the Solution

You’re Really Good at That! (Or How You Become a Trainer)

When I took my first position as a trainer over 10 years ago, I had no idea how complex it would be. In hindsight I can see how ill-prepared I was to create meaningful trainings. This isn’t to say they were terrible or ineffective, or that I wasn’t good at my job. It does mean that I had a lot to learn. Like many of you, my first job as a trainer resulted from being identified as an ‘expert’ in what I did. The criteria for this varies, but in most cases, your manager is impressed by the work you’re doing or you’ve been doing it so long that you know all the ins and outs of a system, process or business. In some industries you’re called a subject matter expert (SME). In others, your title might include Lead, Head or another similar attribute. These are words that signal that you’re well-versed in your craft. By being labeled any of these, you’ve been selected to pass your knowledge and methods on to the ‘next generation’ or maybe even your peers.

I’m Good at My Job, I Swear

You’re excited and you have tons of ideas, tips and tricks to share. You sit down and think about all of the things that make you good at your job. The final outcome is a training outline or version 1 of a training manual. Armed with your knowledge and any class resources you’ve put together, you deliver your first training session and…you bomb.
Your audience is confused, you’re frustrated and everyone is resentful of what they perceive as time wasted.
You take a moment to reflect, attempting to figure out what went wrong. I’ve been there, believe me. In these situations, our minds can start to use blame as a way to rationalize everything.
Maybe I’m not that good at my job, or perhaps Simon just wasn’t trying hard enough. Our brains lead us to believe that somewhere along the line, someone messed up.
This isn’t entirely untrue, but it assumes that the failure was the result of a person rather than a system. If we remove you (the person) and the learners as the root cause of the failure, then you are left with content and delivery. Let’s start by looking at the three critical components of communication when designing a training.

Where’s the Breakdown?

The first two, defined by Stolovitch and Keeps as declarative and procedural knowledge, constitute the majority of your content. The third is the idea of Adult Learning principles and it deals with how you choose to relay those first two components to your audience.
  • Procedural: You’ve processed customer returns every day for 2 years. It’s second nature to key in code 555 to bypass the three standard welcome screens. It takes nothing at all to complete the entire logging and refund event in 3 minutes. This is procedural knowledge. You can think of it as all of the manual tasks you can complete without thinking about them. For me, a good example is knitting a washcloth.

Take a moment and think about a task that you can do without putting too much thought into it. This excludes things like breathing and blinking!

  • Declarative: Now if I asked you to talk me through that task, how accurate do you think your first try would be? How long do you think it would take you to describe it so that I could replicate the process flawlessly? This is called declarative knowledge.
Many experts (like you and me) rely on procedural knowledge to do our jobs. Translating procedural knowledge into declarative knowledge often breaks down, which can lead to that familiar feeling of failure for both parties.
To further complicate things, in trying to teach others to work the way that we work, we sometimes forget that adults have their own feelings and experiences related to learning new things. We can’t just say ‘do it exactly like this’ and not expect pushback, especially from those who have already done the job before.

 How Can I Improve My Trainings?

  • According to the authors of Telling Ain’t Training, an effective way to set the stage for your trainings is to address positive and negative experiences related to the topic at the start of the session. This will:
    • Defuse tension by acknowledging the past experiences of the participants, and
    • Allow you to gauge what participants know and how they approach problem solving.
  • Establish the why and how of the trainings. As discussed in other posts, relevance is extremely important to adult learners. They need to know why they are learning this content and how it can be immediately applied to make their jobs easier and/or more efficient. You also want to discuss how they’ll be learning to do these things.
  • Here’s where the Adult Learning Principles come in:
    • Integrate real world scenarios as a way to demonstrate practical application to your learners. This also allows them to participate and share their experiences in similar situations.
    • Include a resource list that participants can refer back to. This can include detailed (or simplified!) explanations of content covered during the course, or further reading related to the topic.
    • Consider Task Mapping – During an Instructional Design course I took in graduate school, I chose to create a training for managers based on their job description. The posting included things like ‘budgeting’, but because many people are promoted into the position, it’s not an inherent skill. By task mapping, I was able to uncover assumed prerequisite knowledge – for example, an understanding of financial terminology, an intermediate grasp of Excel and formulas, and an understanding of profit margins – and build from that level before moving on to budgeting.
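A task map is essentially a prerequisite graph, so once you’ve written one down, a teachable ordering falls out of it automatically. Here’s a small sketch using Python’s standard library – the skill names mirror the budgeting example above but are otherwise my own invention:

```python
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# Hypothetical task map: each skill maps to the prerequisite knowledge
# that must be taught before it.
task_map = {
    "budgeting": {"financial terminology", "Excel formulas", "profit margins"},
    "Excel formulas": {"basic Excel"},
    "profit margins": {"financial terminology"},
}

# static_order() yields prerequisites before the skills that depend on them,
# which is exactly the teaching sequence a task map is meant to surface.
teaching_order = list(TopologicalSorter(task_map).static_order())
```

The output always places ‘basic Excel’ and ‘financial terminology’ before the skills built on them, with ‘budgeting’ last – the same bottom-up sequencing the exercise uncovers by hand.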

The next time you design a training, clearly define what you need to teach your learners and then use the suggestions above to decide how you’re going to do it. Throughout the process, be conscious of what you’re doing versus what you’re saying when outlining or developing content. Read it out loud to yourself and, if possible, ask someone else (a non-SME) to follow along. If they can’t, rework the areas where they get stuck until you have a resource that is thorough but easy to understand and follow.

Stay tuned for my next post which explores the best ways to build a strong foundation for all your training needs!

Program Evaluation in Adult Learning Environments

A Little About this Project

Although this case study focused on my role and the product my company was selling, the key components of evaluation remain the same. As you read through this article, consider how you currently evaluate success in your role and in your company. How does it compare to what is discussed below? How robust is the evaluation process at your company? Is there any room for improvement? Ready? Let’s start!

Evaluation – Defining Anticipated Outcomes

The evaluation is possibly the most critical aspect of running a program. Because not all problems can be adequately addressed while a program is running, it is imperative that an evaluation process be built into the creation of any program, regardless of its length. This allows you to look beyond individual participant success and gather data on the other, equally important components of what makes a program successful. Soliciting this information during or after a course allows program stakeholders to revisit their initial assumptions about what the course or program should accomplish and what it takes to reach those goals. By committing to the creation and implementation of an evaluation plan, you are committing to the continued success of your program.

Philosophy of Evaluation

Whereas assessments should focus on the role of the instructors in ensuring student success, the evaluation should be thought of in terms of gauging program health. The in-class assessments let us know if participants are grasping content and if not, what we can do to adjust our approach. There are other aspects, such as whether the program meets participant expectations, that are not captured through project rubrics or other classroom assessment techniques. This is where the concept of evaluation comes in.

Once complete, it’s important to review the program in its entirety to discover whether the program has served its purpose, whether there are areas of improvement that can be addressed prior to the start of the next program run and whether the program, in its current iteration, provides value to stakeholders. To do this, evaluations need to be:

  • Objective: You, the instructional designer, facilitator or other stakeholder, have invested a substantial amount of time in developing this program. It may be hard to hear that it’s not working the way it’s supposed to. It’s important not to let fear keep you from asking those hard, direct and neutral (non-leading) questions.
  • Actionable: Regardless of the way it is collected, evaluation data must be actionable. At times, we may fall into the trap of asking participants completely subjective questions that have no clear action items associated with them. Conversely, asking objective questions that only allow for a yes or no may not give you any useful data around suggested improvements.
  • Targeted: Know what you’re evaluating ahead of time and why. While you may be tempted to ask participants how they felt about the location of the school, unless you have the ability to change that, it may be a waste of time – yours and the participant’s – to include it. Questions related to program goals, instructional quality, course content and overall satisfaction all fall under the category of things your stakeholders will want to know. A rule of thumb is to determine key health metrics at the outset of the program and ask questions that speak to those items.

Which aspects of the program are being evaluated and how?

As discussed, what you intend to evaluate should be established prior to the start of the program. This is especially crucial if this is a pilot, as it is likely that there will be areas requiring immediate refinement. For this course, four core areas were identified. In order to determine whether this course was effective, we will evaluate whether participants:

  • Are learning what we expect them to learn
  • Believe the content is delivered in a way that is accessible and easy to understand
  • Can make clear connections between the content they are learning and the content’s relevance to their everyday lives and,
  • Would recommend this program, including the setup, content and instructors, in the future

Feedback will be collected via exit tickets (surveys) administered by the instructor at the end of each session. These surveys will consist of four questions. The exit tickets are meant to be section-by-section snapshots evaluating the immediate impact of the course. At the completion of the course, a more comprehensive end-of-course survey will be administered.

Sample Evaluation #1 – Exit Tickets

Purpose: Evaluate the quality of content and instruction at the end of each session
Deliverable: Survey (hosted in Google Docs) – the same form is used throughout the course, differentiated by the submission date
Questions:
  1. In one sentence, summarize the most important concept you learned today (Open text)
  2. The material covered during this session is immediately applicable to my job (Likert scale)
  3. The material covered during this session was presented in a clear and approachable manner (Likert scale)
  4. Is there anything else you’d like to add? (Open text)

Sample Evaluation #2 – End of Course Survey

Purpose: Evaluate the quality of content and instruction of the overall program
Deliverable: Survey (hosted in Google Docs)
Questions:

Overall Experience

  1. Would you recommend this course to a friend or colleague? (NPS)
  2. What is the most important reason you gave us that score?
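For reference, NPS is derived from 0–10 responses to question 1: the percentage of promoters (scores of 9–10) minus the percentage of detractors (0–6), with passives (7–8) ignored. A quick sketch of the arithmetic:

```python
def nps(scores: list[int]) -> int:
    """Net Promoter Score from 0-10 ratings:
    % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(s >= 9 for s in scores)
    detractors = sum(s <= 6 for s in scores)
    return round(100 * (promoters - detractors) / len(scores))
```

For example, `nps([10, 9, 8, 7, 3])` gives 20: two promoters and one detractor out of five responses. The score ranges from -100 (all detractors) to 100 (all promoters).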

Content

  1. The material covered throughout this course met my expectations (Likert Scale)
  2. The material covered throughout this course was relevant to my job (Likert Scale)
  3. The material covered throughout this course is immediately applicable (Likert Scale)
  4. I understood what the requirements were for all of my assignments (Likert Scale)
  5. Is there anything you’d like us to know about the content? (Open text)

Delivery

  1. The instructor presented in a way that made the material easy to understand (Likert Scale)
  2. The instructor provided meaningful feedback (Likert Scale)
  3. I had the tools and resources I needed to participate in class and complete assignments (Likert Scale)
  4. Is there anything you’d like us to know about the delivery? (Open text)

 

What happens to the data?

During the Course

During the course, data collected from the exit tickets will evaluate whether the content is relevant and immediately applicable for participants. In addition, it will evaluate delivery techniques and allow program facilitators to provide supplemental materials in areas where the content is lacking. In the event that the issue lies with instruction, alternative methods can be explored, including but not limited to the introduction of blended or self-directed learning tools, additional AV equipment or even a change of venue. The purpose of the end-of-session and end-of-course evaluations is to cover as many modifiable external components as possible and establish a plan of action to be implemented during the current cohort or for the next.

After the Course

All data will be aggregated in an Excel file and used to create a dashboard that identifies trends, weaknesses and strengths of the program. The exit tickets and end-of-course survey responses will be coded and categorized into Content, Delivery, Tools and Environment and then into stages based on the length of time it will take to implement. Quantitative and qualitative data will be used to drive decisions around revising content and/or delivery methods and whether any additional financial investments will need to be made for tools or program development. Finally, data collected will inform what kind of training, if any, would benefit the instructor based on feedback around delivery. A timeline will be established for all improvements and the changes will be communicated to stakeholders, as well as future and former participants.
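As a hypothetical sketch of that roll-up (the category names match the four buckets above; the sample rows and scores are invented), coded responses might be aggregated like this:

```python
from collections import defaultdict
from statistics import mean

def category_averages(coded_rows):
    """Roll coded survey rows (category, Likert score) up into per-category
    averages - the same summary a dashboard sheet would display."""
    buckets = defaultdict(list)
    for category, score in coded_rows:
        buckets[category].append(score)
    return {category: mean(scores) for category, scores in buckets.items()}

# Sample rows after coding: each open-text or Likert response has been
# assigned to Content, Delivery, Tools or Environment.
sample = [
    ("Content", 4), ("Content", 5), ("Delivery", 3),
    ("Delivery", 4), ("Tools", 2), ("Environment", 5),
]
```

In practice the same grouping happens in Excel with a pivot table; the point is that once responses are coded into consistent categories, the trend, weakness and strength views fall out of a single aggregation.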

 

Understanding Your Learner

This is the first in a series building on the core concepts explored in Telling Ain’t Training. Click here to read the rest of the series.

Understanding Your Learner – What’s Your Approach?

When designing trainings, how often have you considered the learner? And in what capacity? Do you think about your delivery method? What about the classroom environment? A dozen things might go through your mind as you work off of your mental checklist. Before we get there let’s take a moment to think a little differently about what training means and align on what it should accomplish.

Telling Ain’t Training starts with a few key points centered around understanding your learners before they even set foot in your classroom, chief among them the tenet that we should be building trainings for the needs of the learners: investigating their roles, responsibilities and prior experience in order to build content that’s meaningful and relevant for them.

What Do You Want to Accomplish?

According to the authors, what we do falls into three categories:

  • Training – Is the goal to teach participants how to complete a step-by-step task?
  • Instruction – Is the goal to teach participants how to react in a situation with one or more variables?
  • Education – A culmination of life experiences and learning principles that go beyond reproducing or inferring; the road to expertise.

The purpose of training, instruction and education is to transform the learner, not transmit data. We want the learner to be able to apply what has been communicated, not just repeat it back.

Find Your Center of Focus

There’s a maxim repeated at the beginning of the book – educators must be “learner centered, performance based.” This encompasses not just your delivery but also the content you build, where you build it, and how you interact with participants. Lose sight of this and you risk losing your credibility and your learner’s interest and respect.

Learner Centered Means Adapting

How we learn is part of our genetic makeup. Gardner’s Theory of Multiple Intelligences tells us that we need to engage different senses and learning types in order to really make teachings stick. The question is how to cater to an audience you’ve never met. Educators can take advantage of what we know about the human body to build flexible courses designed to engage different types of learners.

Think About It!

Humans can store massive amounts of data. The issue lies in retrieving it. Assuming that it’s relevant to the learner, organization is the key. Consider the acronym PEMDAS and the mnemonic My Very Educated Mother Just Served Us Nine Pizzas. Can you remember what they mean? If the answer is yes, when was the last time you needed to use that information? Chances are you haven’t consciously thought about either in a long time, but the information still lives on. That’s the power of organization coupled with effective teaching and the human brain.

This, of course, doesn’t mean the classes where we learned these pieces of info were perfect, but rather that someone stumbled upon a great memory technique that may or may not have translated into other parts of the curriculum. For example, I can’t readily recall most of what I learned in Earth Science, but I vividly remember Algebra. The teacher included hands-on and group activities, employed a reward system, and used visual and audio cues to draw connections between prior knowledge and newer, more complex pieces of information.

Put it in practice!

Make a list of ways you can engage your audience. Include exercises and content that appeal to the following:
  • Musical-rhythmic and harmonic.
  • Visual-spatial.
  • Verbal-linguistic.
  • Logical-mathematical.
  • Bodily-kinesthetic.

Using Inter-team Collaborations to Promote Critical Thinking Skills

This is part of a 3-part series focusing on applying adult learning theory in the workplace. To see the other articles, view A Brief Intro to Adult Learning Theory and Self Directed Learning as a Training Solution.

Current Problem

Teams across 15 campuses are finding it increasingly difficult to track information and share it with stakeholders. In addition, there is a lack of standardization in the way programs are run, further complicating alignment and adoption of initiatives to solve this problem.

Proposed Solution

Many of these conversations have started with ‘we can’t’ or ‘we don’t’. We can’t change this system, or we don’t use this method. I usually ask probing questions to get to the bottom of these opinions. Is the ‘can’t’ related to something tangible? Is the ‘don’t’ due to a lack of structure or something else? Until a problem can be broken down into its smallest parts, a holistic fix cannot be created. Asking teams to think critically about their current processes will force all parties to honestly evaluate the problems they face. I highly recommend using this list of Socratic Questions to get past the surface issues. Much like the worksheet found at the bottom of this page, you can create an evaluation worksheet asking similar guiding questions to reach a conclusion.

Why Promote Critical Thinking?

Human beings draw conclusions from past experience – personal or otherwise. We can become entrenched in views because we have had positive and negative experiences that influence what we think will happen. Perhaps we were involved in an unpleasant outcome related to group work that prevents us from trying something new. Conversely, maybe we heard of a new technology and we want in on it because everyone else is raving about it. Either way, it’s necessary to break down and evaluate these feelings or instincts in order to make logical and well-informed decisions. Even if the result is less than stellar, you have begun a process of evaluation that allows you to continue to build until you reach success.

What Does it Look Like?

Start by creating a list of specific issues that your team is trying to tackle. If you are a manager, create your own and allow teams to do the same. For each issue, have team members brainstorm about the root causes and 2-3 possible solutions. You can have participants fill out the worksheet to the right, use the template at the bottom of this page, or create your own.

It is unlikely that everyone will have the same answers, or answers that get to the bottom of the problem. Ask team members to share their view of the problem, then use Socratic Questions to whittle it down to its simplest form. This might look something like this:

Stated Problem: There aren’t enough people or resources available to complete this project.

Follow Up Questions

  • Can you give me an example of a time when this was apparent?
  • Can you describe the scope of the project? In what ways can we leverage the strengths of you and your team members to solve this problem?
  • How can we look at this another way?
Do the same for each section of the worksheet until there is a clear idea of what actionable steps can be taken to begin to solve the issue.

Once that has happened, have teams pair off with members of other teams to get feedback on how they handled similar situations. The idea is to have fresh perspectives challenge existing perceptions and require the problem solvers to re-evaluate, explain and if necessary, defend their beliefs and next course of action.

Will this work for me?

  • There is a question or issue that requires deeper exploration in order to resolve
  • You are looking to engage your audience and get them involved through intellectual contributions
  • You need to change long-held thoughts or processes but are facing resistance (change management)
  • You want to empower your audience to go beyond surface knowledge and use their skills and experience to develop their own ideas

Want to try it out? Use this PDF template to build a framework for applying any adult learning strategy to your current work.

A Brief Intro to Adult Learning Theory

There’s a lot of info about learning theory in the early years, but what about for adults?

That’s the first question every adult educator should start with. Unlike K-12 education, there aren’t strict governing bodies that inform every decision made in adult education. Instead, our community depends on years of independent and industry research as the basis for our practice. This means that there isn’t just one, or two, or even three ways in which we believe adults learn. In fact, as a whole, facilitators of adult education haven’t entirely agreed on what that term actually means. They have agreed, however, that a few key concepts are consistent when teaching adult learners.

Adult learners are looking for:
  • Relevance of content.
  • Immediately applicable skills.
  • Involvement in the process.
  • Acknowledgement and inclusion of prior experience.
  • Flexibility in the way content is taught.

Can you break that down for me?

Of course! We’ll go piece by piece so that you can get an idea of how these seemingly simple components come together to form the complex field of adult education. Before we continue, remember that no one component is inherently more important than the others, and there are a billion other factors that determine why a student showed up to your class. Also, keep in mind that each learning situation is based on the circumstances and abilities of the instructor, the learners, and the environment in which you teach.

Relevance of Content

Do you remember sitting in pre-calculus and desperately wondering why you were learning it? What’s the likelihood of using advanced math in your everyday life? It didn’t really matter because someone decided you needed to learn it, and so you did. Or I assume you tried to. If you’re like me, you didn’t retain anything after algebra because it held no relevance to you.

There are so many reasons someone shows up to your classroom: job-mandated training, skill building for employment, individual pursuit of knowledge. Regardless of what got them there, your students are looking to learn something meaningful. It’s important for you, as an educator, to identify that reason in order to ensure the success of your students.

Immediately Applicable Skills

Skills don’t always have to be manual, but for most adult learners, they do need to be immediately applicable. That means that what you teach today should be translatable to what your student does tomorrow.

Regardless of your audience, your content should aim to teach practical skills or knowledge in a way that is easy to relate to. Learners should know why they are being taught the content and how they can expect it to help them in their personal or professional lives.

Involvement in the Process

The most prevalent classroom structure in K-12 is teacher as leader. This means that the teacher, or person at the front of the room, makes all decisions about what and how content is learned. This can lead to passive learning, in which your audience only learns what you teach them with no consideration for their own interests, strengths or preferences.

Involvement level can vary based on any number of factors: subject matter, government guidelines, time constraints, program structure, and audience composition are just a few. Although it may initially sound challenging, there are simple ways to get everyone involved and invested in what is being taught. Having your learners share what they want to learn during the course or what projects they would like to work on, and then integrating that feedback into your lesson, are just two examples of how this could work.

Acknowledgement and Inclusion of Prior Experience

One major difference between K-12 and adult education is that children are assumed to have no prior experience to build upon. This is not the case with adults, as discussed by Paulo Freire and, to some extent, John Dewey.

The model of ‘educational banking’ does not translate well to adult education because adults are not empty vessels. They carry with them many years of experiences that shape the way they view the world and approach every situation. In order to keep them engaged, it’s important for educators to acknowledge this fact and look for ways to incorporate those experiences into the lesson. Asking learners to apply what they are currently learning to past experiences is an easy way to include student experiences in the classroom and help them understand the value of the content being taught.

Flexibility in the Way Content is Taught

We’re all familiar with the saying ‘one size doesn’t fit all’. That same idea applies to education. Although your learners need to learn the same content, it is unlikely that everyone will learn the same way at the same speed. There are several theories that address this including Gardner’s Theory of Multiple Intelligences, McClusky’s Theory of Margin and work done by Malcolm Knowles.

Although we have the benefit of technology, finding a video on YouTube or a link through Google does not guarantee learning. When we talk about the way content is taught, consider whether you are incorporating active learning techniques or content that integrates different learning styles.

This is part of a 3-part series focusing on applying adult learning theory in the workplace. To see the other articles, view Self Directed Learning as a Training Solution and Using Inter-team Collaborations to Promote Critical Thinking Skills.