Telling Ain’t Training – Pt 7: Workplace Reference Guide?

Final Verdict

In short, Telling Ain’t Training is a valuable reference guide for anyone working in a training or informal education setting. If nothing else, the authors do a fantastic job of explaining why SMEs don’t always make the best teachers and of providing guidance on how to get them where they need to be. Additionally, the chapters on technology offer critical and salient advice about what to keep in mind when deciding whether your training will benefit from technology, most notably the potential benefits and the pitfalls you might run into. You can pick up a copy on Amazon here.

If you’re looking for activities that you can incorporate into training or adult learning sessions, I highly recommend Marcia L. Tate’s Sit and Get Won’t Grow Dendrites: 20 Professional Learning Strategies That Engage the Adult Brain (you can buy it here on Amazon).

And in case you missed it, check out the other six posts in this series:

Telling Ain’t Training – Pt 6: Technology Integration

Overview

In this final post of the series, I’ll cover the potential benefits of integrating technology into training. There are several considerations, most importantly the impact technology can have on the efficacy of the training. Stolovitch and Keeps summarize these factors in saying,

When it comes to training efficiency, the measure is fast and cheap. When it comes to training effectiveness, the measure is how well the learning goal is achieved.

Telling Ain’t Training, pg 181

Technology can help meet both metrics, as long as you understand that using technology to deliver content does not replace solid training design. The use of technology in training “can enable efficiency” if properly implemented. It can also veer toward the gimmicky if its use isn’t well thought out or well executed.

Working at a tech education company and matrix managing dispersed teams, I rely on technology constantly. When I started, I was onboarded to dozens of systems with no real explanation of why or, really, of their uses. Some stuck because of their prevalence (Slack, for example) and/or their functionality (Google Docs). A few systems have found their way into my personal life, most notably in my Instructional Resources Trello board. I’ve continued to explore our current technologies in order to leverage and expand our utilization of existing resources, including leading remote workshops using shared Google Slides decks and Zoom video software. I’ve even tried my hand at using a free LMS, Latitude Learning, to start hosting content.

As you can see, the list of technologies you can incorporate into training can get really long, really fast, and we haven’t even included the old standbys like Adobe Acrobat, the Microsoft Suite, Google Hangouts, Survey Monkey, and services like Moodle. With all of these ‘productivity tools’ floating around, it’s helpful to have a framework around what they can do to assist and then start narrowing down your specific use case.

What can you get out of integrating technology?

Chapter 10 of Telling Ain’t Training focuses on the use of technology in trainings, why you might consider using it and some of the caveats you must face. There’s a great chart on pages 184-186 (which I’ve summarized below) that outlines the potential benefits of integrating technology. I also encourage you to read that chapter to find out why promises of increased productivity and reduced costs from outside vendors may be too good to be true.

Potential Benefits and What They Mean

  • Accessibility: Anyone can access the training from anywhere; this can help reach remote teams and provide an opportunity to train people requiring accessibility accommodations.
  • Instantaneous response and feedback: Instructors and participants can contact each other and receive near-instantaneous responses; allows for automatic responses or feedback based on preset criteria.
  • Instantaneous testing and feedback: Testing can be created and hosted within certain platforms – this is especially true of multiple choice questions, or trainings in which the program is both synchronous and asynchronous.
  • Consistency of message: Templates, training and one delivery platform can result in a more consistent message that can be monitored and maintained by a relatively small team.
  • Rapidity of delivery: Reduces the need to coordinate in-person trainings; eliminates the need to schedule spaces and allows people to join when needed.
  • Simultaneity of training delivery: Provides a platform to deliver one training to a large number of participants.
  • Ease of update: Since all resources live within one system or platform, the versioning issues often seen with static documents are reduced; updates can be pushed at one time so that everyone gets them at the same time.
  • Reusability: Trainings can be delivered over and over again without a reset period; content can be repurposed for other trainings.
  • Flexibility of use: Use all or some pieces of a platform, use it for all or part of the training, use it for different types of training, or host modules, pathways, etc. for different types of content.
  • Interactivity: Include audio, video, slides, Prezis, responsive tests and websites.
  • Adaptability: Depending on the platform, content can be changed (scaled, updated, amended, appended) to fit other trainings/programs, or provide dynamic content that responds to learners’ needs.

The absolute most important thing to remember is that these benefits are conditional. They aren’t guaranteed and are heavily reliant on your current resources, your company’s infrastructure, setup costs – including training of internal users and onboarding – and time constraints, among a host of other factors. For these reasons, content, rather than delivery method, should be the deciding factor in whether a piece of technology is incorporated into a program.

Telling Ain’t Training – Pt 5: Training in a Collaborative Workplace

I’m going to tell you a not-so-secret: training adults is a game of social circles and politics. At one of my employers, training required a lot of buy-in from different groups, and participants generally wanted to feel like they were actively contributing to the event, rather than being on the receiving end of a final product. The dynamic can be challenging – how can you run a training if the people being trained believe that they know all there is to know?

There are a couple of things at play here. In today’s business environment, it’s all about your title and your tenure. If you’re not in a management position, it can be hard to get people to follow your lead. I’ll save my leadership lessons for later but for the purpose of this post, we’ll focus on the idea of collaborating with participants to deliver a successful training.

Chapter 8 of Stolovitch and Keeps’s Telling Ain’t Training presents 25 scenarios you can use to add practical application to your trainings. I was planning an upcoming workshop around managing student concerns on campus and was excited to try some of them out. I flipped through each of them, eager to try something new. As I skimmed the exercises, I realized that none of them would work for me.

Why?

Because our team, dispersed across 15 cities and 3 continents, knows what they’re doing, and they don’t want someone trying to get them to learn something by rote. What this book presents is, in its truest form, training. Reading through the scenarios, I realized that what I wanted was a workshop. I wanted an event that had true learner participation and a tangible end result.

I ended up with a format that was predominantly learner led, with me giving confirming and/or corrective feedback and taking notes when someone brought up a suggestion that aligned with best practices. In addition to having participants share out and complete two practical application exercises, I also asked them to ‘help’ me come up with a guide that others can use to apply the standards set during this workshop to any situation.

Participant feedback was overwhelmingly positive, and I felt confident that they would be able to immediately implement their learning in their day-to-day responsibilities. Furthermore, the deck is available for reference and a recorded version will be made so that remote campuses can provide the workshop asynchronously.

Telling Ain’t Training – Pt 4: Confirming and Corrective Feedback

A Familiar Scenario

Imagine you spent an entire weekend writing a paper for your Instructional Design course. It’s a lot of work, and you’re unfamiliar with the content. You dedicate a few hours to reviewing the syllabus, assignment description and resources, and you feel pretty confident in your final result.

When you get the grades back, you’re shocked to see a C+ next to your name. Under the feedback section, you get the following comment from your instructor:

“You’re just not getting it. Reread the diagram on page and then resubmit.”

As the learner, take a moment to jot down your internal reactions and external actions. I’ve shared mine in this chart.

External Actions and Internal Reactions

  • Reread the syllabus – Confusion
  • Revisit the diagram – Frustration
  • Contact the instructor for clarification – Demotivation
  • Email classmates to ask them for help – Resentment

Well, that escalated quickly, didn’t it? I’m sure the instructor didn’t mean to imply that I, the learner, hadn’t done my due diligence in reading all relevant materials, but that’s what it feels like. I might reach out to other participants to discuss my confusion only to find that they felt the same way. A picture begins to emerge – one of a miscommunication that, when repeated, can quickly snowball into a negative learning experience.

This example is applicable to any educational setting and is indicative of a few areas of contention that I have experienced as both a learner and an educator. We’ll break the statement down to find out why it fails to be useful.

“You’re just not getting it.”

This statement implies that there is a flaw with the learner that prohibits them from grasping the content. It’s a variation of the old “try harder”, as if effort alone is all it takes to learn. Additionally, as described in a previous post, it can feel like a personal attack.


“Reread the diagram on page and then resubmit.”

What’s not to love? It tells the learner where to look, which some might categorize as a helpful hint. The problem here is that the instructor doesn’t acknowledge their responsibility to expand on or further clarify the instructions. If the learner stared at the diagram for thirty additional minutes, would that somehow influence their comprehension? Should the instructor provide more context or other support to ensure the learner understands the content and remains motivated throughout the course or session?

The answer is yes; that is exactly the hallmark of a good educator, and if you’re prepped appropriately for the topic you’re teaching, it doesn’t take much to make adjustments.

Corrective and Confirming Feedback

Using the same scenario as above, imagine if you received this feedback instead:

“Not quite, but you’re on the right track. You’ve done a good job of explaining x, but y is missing. I recommend reading resources 1 and 2 again and using z to frame your answer.”

It takes a few more words, sure, but it accomplishes a few things:
  • Sets a positive tone
  • Calls out what is right (because no one likes to be wrong all the time!)
  • Calls out areas of improvement (after the praise)
  • Suggests concrete ways to improve
  • Provides additional resources and/or context

Using a combination of corrective and confirming feedback empowers students to explore topics independently, while looking to instructors/facilitators/trainers for guidance and support. This doesn’t mean that you, the educator, need to handhold, coddle or give all of the answers away. Instead, it shows that you respect your learners and their ability to learn in the ways that are best for them, as well as your support for their educational journey.

Think of the last time you learned a new and complex topic. If someone had offered guiding tips and suggestions that helped you relate the content to something you already knew, or framed it within the context of your current life, wouldn’t that have made the experience not only more enjoyable but more effective?

What do you think? Have you used corrective and/or confirming feedback? What have been the results? As a learner, what type of feedback are you used to receiving and how does it influence your learning experience?

Telling Ain’t Training – Pt 3: Training the Right Way

Overview

The first two posts in this series talk about what we generally encounter as trainers – what we might define as failures in ourselves or our learners. I also cover a few techniques you can quickly implement for existing trainings or for those instances when you need to supplement content. This chapter and post focus on building the correct foundation to avoid those failures altogether.

In essence, there is a set of guidelines that allows us to build effective trainings independent of the learning styles of participants. Let’s take a look at what they are.

Six Guidelines for Creating Successful Trainings

  • The Why: We’ve talked about this quite a few times, but it bears repeating. Learners need to know why they’re learning the content. If they place high value on the training and content, they are more likely to engage and retain information.
  • The What: Do you know what you’re teaching? Can you articulate this ‘what’ using specific learning objectives? They should be listed in the course description, and maybe on the syllabus or in the classroom. This sets the end goal for you and your learners.
  • The Structure: “Humans seek order” (pg 75) is something I have come to realize working in operations, and even more so as an educator. Order allows participants to quickly grasp patterns as well as connect previously held knowledge to newly learned information.
  • The Response: How do you plan to add interactions to your sessions? Response refers to the way in which learners respond to the content you’re presenting. According to research, as well as Stolovitch and Keeps, this can take the form of “answering a question, filling in a blank, labeling something, solving a problem, making a decision or even discussing and arguing” (pg 76).
  • The Feedback: Feedback is information that learners receive about how on or off target they are. It comes from the facilitator or instructor, or from other environmental components (e.g. a chemical reaction during a science experiment, or a red ‘x’ or green check mark during an online quiz). Research indicates that feedback should be immediately relevant to the task. Personal criticism, perceived or otherwise, decreases performance. Additionally, feedback should be timely, frequent and specific.
  • The Reward: What does the learner get for successfully completing a task? Rewards work the same way as they did in childhood; they motivate learners to continue a desired behavior. The actual reward will vary, but as long as it is perceived as valuable to the learner, it will be a successful motivational tool.

 

Telling Ain’t Training – Pt 2: Blame Isn’t the Solution

You’re Really Good at That! (Or How You Become a Trainer)

When I took my first position as a trainer over 10 years ago, I had no idea how complex it would be. In hindsight, I can see how ill-prepared I was to create meaningful trainings. This isn’t to say they were terrible or ineffective, or that I wasn’t good at my job. It does mean that I had a lot to learn. Like many of you, my first job as a trainer resulted from being identified as an ‘expert’ in what I did. The criteria for this vary, but in most cases, your manager is impressed by the work you’re doing, or you’ve been doing it so long that you know all the ins and outs of a system, process or business. In some industries you’re called a subject matter expert (SME). In others, your title might include Lead, Head or another similar attribute. These are words that signal that you’re well-versed in your craft. By being labeled any of these, you’ve been selected to pass your knowledge and methods on to the ‘next generation’ or maybe even your peers.

I’m Good at My Job, I Swear

You’re excited and you have tons of ideas, tips and tricks to share. You sit down and think about all of the things that make you good at your job. The final outcome is a training outline or version 1 of a training manual. Armed with your knowledge and any class resources you’ve put together, you deliver your first training session and…you bomb.
Your audience is confused, you’re frustrated and everyone is resentful of what they perceive as time wasted.
You take a moment to reflect, attempting to figure out what went wrong. I’ve been there, believe me. In these situations, our minds can start to use blame as a way to rationalize everything.
Maybe I’m not that good at my job, or perhaps Simon just wasn’t trying hard enough. Our brains lead us to believe that somewhere along the line, someone messed up.
This isn’t entirely untrue, but it assumes that the failure was a result of a person, rather than a system. If we remove you (the person) and the learners as the root cause of the failure, then we are left with content and delivery. Let’s start by looking at the three critical components of communication when designing a training.

Where’s the Breakdown?

The first two, defined by Stolovitch and Keeps as declarative and procedural knowledge, constitute the majority of your content. The third is the idea of Adult Learning principles and it deals with how you choose to relay those first two components to your audience.
  • Procedural: You’ve processed customer returns every day for 2 years. It’s second nature to key in code 555 in order to bypass the three standard welcome screens. It takes nothing at all to complete the entire logging and refund event in 3 minutes. This is procedural knowledge. You can think of it as all of the manual tasks you can complete without thinking about them. For me, a good example is knitting a washcloth.

Take a moment and think about a task that you can do without putting too much thought into it. This excludes things like breathing and blinking!

  • Declarative: Now, if I asked you to talk me through that task, how accurate do you think your first try would be? How long do you think it would take you to describe it so that I could replicate the process flawlessly? This is called declarative knowledge.

Many experts (like you and me) rely on procedural knowledge to do our jobs. Translating procedural knowledge into declarative knowledge often breaks down, which can lead to that familiar feeling of failure for both parties.

To further complicate this, in trying to teach others to work the way that we work, we sometimes forget that adults have their own feelings and experiences related to learning new things. We can’t just say ‘do it exactly like this’ and not expect pushback, especially from those who have already done the job before.

 How Can I Improve My Trainings?

  • According to the authors of Telling Ain’t Training, an effective way to set the stage for your trainings is to address positive and negative experiences related to the topic at the start of the session. This will:
    • Defuse the situation by acknowledging the past experiences of the participants and
    • Allow you to gauge what participants know and how they approach problem solving.
  • Establish the why and how of the trainings. As discussed in other posts, relevance is extremely important to adult learners. They need to know why they are learning this content and how it can be immediately applied to make their jobs easier and/or more efficient. You also want to discuss how they’ll be learning to do these things.
  • Here’s where the Adult Learning Principles come in:
    • Integrate real world scenarios as a way to demonstrate practical application to your learners. This also allows them to participate and share their experiences in similar situations.
    • Include a resource list that participants can refer back to. This can include detailed (or simplified!) explanations of content covered during the course, or further reading related to the topic.
    • Consider Task Mapping – During an Instructional Design course I took in graduate school, I chose to create a training for managers based on their job description. The posting included things like ‘budgeting’, but because many people are promoted into the position, it’s not an inherent skill. By task mapping, I was able to uncover assumed prerequisite knowledge – for example, an understanding of financial terminology, an intermediate grasp of Excel and formulas, and an understanding of profit margins – and build from that level before moving on to budgeting.

The next time you design a training, clearly define what you need to teach your learners and then use the suggestions above to decide how you’re going to do it. Throughout the process, be conscious of what you’re doing versus what you’re saying when outlining or developing content. Read it out loud to yourself and, if possible, ask someone else (a non-SME) to follow along. If they can’t, rework the areas where they get stuck until you have a resource that is thorough but easy to understand and follow.

Stay tuned for my next post which explores the best ways to build a strong foundation for all your training needs!

Program Evaluation in Adult Learning Environments

A Little About this Project

Although this case study focused on my role and the product my company was selling, the key components of evaluation remain the same. As you read through this article, consider how you currently evaluate success in your role and in your company. How does it compare to what is discussed below? How robust is the evaluation process at your company? Is there any room for improvement? Ready? Let’s start!

Evaluation – Defining Anticipated Outcomes

The evaluation is possibly the most critical aspect of running a program. Because not all problems can be adequately addressed while a program is running, it is imperative that an evaluation process is built into the creation of any program, regardless of its length. This allows you to look beyond individual participant success and gather data on the other, equally important components of what makes a program successful. Soliciting this information during or after a course allows program stakeholders to revisit their initial assumptions about what the course or program should accomplish and what it takes to reach those goals. By committing to the creation and implementation of an evaluation plan, you are committing to the continued success of your program.

Philosophy of Evaluation

Whereas assessments should focus on the role of the instructors in ensuring student success, the evaluation should be thought of in terms of gauging program health. The in-class assessments let us know if participants are grasping content and if not, what we can do to adjust our approach. There are other aspects, such as whether the program meets participant expectations, that are not captured through project rubrics or other classroom assessment techniques. This is where the concept of evaluation comes in.

Once the program is complete, it’s important to review it in its entirety to discover whether it has served its purpose, whether there are areas of improvement that can be addressed prior to the start of the next program run and whether the program, in its current iteration, provides value to stakeholders. To do this, evaluations need to be:

  • Objective: You, the instructional designer, facilitator or other stakeholder, have invested a substantial amount of time in developing this program. It may be hard to hear that it’s not working the way it’s supposed to. It’s important that fear not keep you from asking those hard, direct and neutral (non-leading) questions.
  • Actionable: Regardless of the way it is collected, evaluation data must be actionable. At times, we may fall into the trap of asking participants completely subjective questions that have no clear action items associated with them. Conversely, asking objective questions that only allow for a yes or no may not give you any useful data around suggested improvements.
  • Targeted: Know what you’re evaluating ahead of time and why. While you may be tempted to ask participants how they felt about the location of the school, unless you have the ability to change that, it may be a waste of time – yours and the participant’s – to include it. Questions related to program goals, instructional quality, course content and overall satisfaction all fall under the category of things your stakeholders will want to know. A rule of thumb is to determine key health metrics at the outset of the program and ask questions that speak to those items.

Which aspects of the program are being evaluated and how?

As discussed, what you intend to evaluate should be established prior to the start of the program. This is especially crucial if this is a pilot, as it is likely that there will be areas requiring immediate refinement. For this course, four core areas were identified. In order to determine whether this course was effective, we will evaluate whether participants:

  • Are learning what we expect them to learn
  • Believe the content is delivered in a way that is accessible and easy to understand
  • Can make clear connections between the content they are learning and the content’s relevance to their everyday lives and,
  • Would recommend this program, including the setup, content and instructors, in the future

Feedback will be collected via exit tickets (surveys), administered by the instructor at the end of each session. These surveys will consist of four questions. The exit tickets are meant to be section-by-section snapshots evaluating the immediate impact of the course. At the completion of the course, a more comprehensive end-of-course survey will be administered.

Sample Evaluation #1 – Exit Tickets

Purpose: Evaluate the quality of content and instruction at the end of each session
Deliverable: Survey (hosted in Google Docs) – the same form is used throughout the course, differentiated by the submission date.
Questions:
  1. In one sentence, summarize the most important concept you learned today (Open text)
  2. The material covered during this session is immediately applicable to my job (Likert scale)
  3. The material covered during this session was presented in a clear and approachable manner (Likert scale)
  4. Is there anything else you’d like to add? (Open text)
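Because the same form is reused across sessions, the exported responses lend themselves to a quick per-session summary. Here is a minimal sketch of that idea; the function, field names and sample numbers are my own illustration (assuming the Likert answers have been coded 1 through 5), not part of the actual evaluation plan:

```python
# Sketch: per-session summary of exit-ticket Likert responses.
# Assumes each exported row is (session, applicability, clarity), with
# Likert answers coded 1 (strongly disagree) through 5 (strongly agree).
from collections import defaultdict

def summarize_exit_tickets(rows):
    """Return {session: {"applicability": mean, "clarity": mean, "n": count}}."""
    by_session = defaultdict(list)
    for session, applicability, clarity in rows:
        by_session[session].append((applicability, clarity))
    summary = {}
    for session, scores in by_session.items():
        n = len(scores)
        summary[session] = {
            "applicability": sum(a for a, _ in scores) / n,
            "clarity": sum(c for _, c in scores) / n,
            "n": n,
        }
    return summary

# Invented sample data, for illustration only
tickets = [
    ("Session 1", 4, 5),
    ("Session 1", 3, 4),
    ("Session 2", 5, 5),
]
print(summarize_exit_tickets(tickets))
```

A dip in either mean for a given session is exactly the snapshot signal described above: it points the facilitator at the session whose content or delivery needs attention before the course ends.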

Sample Evaluation #2 – End of Course Survey

Purpose: Evaluate the quality of content and instruction of the overall program
Deliverable: Survey (hosted in Google Docs)
Questions:

Overall Experience

  1. Would you recommend this course to a friend or colleague? (NPS)
  2. What is the most important reason you gave us that score?

Content

  1. The material covered throughout this course met my expectations (Likert Scale)
  2. The material covered throughout this course was relevant to my job (Likert Scale)
  3. The material covered throughout this course is immediately applicable (Likert Scale)
  4. I understood what the requirements were for all of my assignments (Likert Scale)
  5. Is there anything you’d like us to know about the content? (Open text)

Delivery

  1. The instructor presented in a way that made the material easy to understand (Likert Scale)
  2. The instructor provided meaningful feedback (Likert Scale)
  3. I had the tools and resources I needed to participate in class and complete assignments (Likert Scale)
  4. Is there anything you’d like us to know about the delivery? (Open text)

 

What happens to the data?

During the Course

During the course, data collected from the exit tickets will be used to evaluate whether the content is relevant and immediately applicable for participants. It will also evaluate delivery techniques and allow program facilitators to provide supplemental materials in areas where the content is lacking. In the event that the issue lies with instruction, alternative methods can be explored, including but not limited to the introduction of blended or self-directed learning tools, additional AV equipment or even a change of venue. The purpose of the end-of-session and end-of-course evaluations is to cover as many modifiable external components as possible and establish a plan of action to be implemented during the current cohort or for the next.

After the Course

All data will be aggregated in an Excel file and used to create a dashboard that identifies trends, weaknesses and strengths of the program. The exit tickets and end-of-course survey responses will be coded and categorized into Content, Delivery, Tools and Environment and then into stages based on the length of time it will take to implement. Quantitative and qualitative data will be used to drive decisions around revising content and/or delivery methods and whether any additional financial investments will need to be made for tools or program development. Finally, data collected will inform what kind of training, if any, would benefit the instructor based on feedback around delivery. A timeline will be established for all improvements and the changes will be communicated to stakeholders, as well as future and former participants.
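As a rough sketch of two pieces of that aggregation, the NPS question and the coded categories could be tallied as below. The category labels come from the plan above; the function names and sample numbers are hypothetical, and the promoter/detractor thresholds are the standard NPS convention rather than anything specified in this program:

```python
# Sketch: two aggregation helpers for the end-of-course survey data.
# NPS convention on a 0-10 scale: promoters score 9-10, detractors 0-6.

def net_promoter_score(scores):
    """Percentage of promoters minus percentage of detractors, rounded."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return round(100 * (promoters - detractors) / len(scores))

def tally_categories(coded_responses):
    """Count coded open-text comments per category named in the plan."""
    counts = {"Content": 0, "Delivery": 0, "Tools": 0, "Environment": 0}
    for category in coded_responses:
        counts[category] += 1
    return counts

# Invented sample data, for illustration only
print(net_promoter_score([10, 9, 8, 6, 10, 7]))  # 3 promoters, 1 detractor of 6 -> 33
print(tally_categories(["Content", "Delivery", "Content"]))
```

The category tallies feed the dashboard directly, while the NPS trend across cohorts gives stakeholders the single program-health number mentioned in the Overall Experience questions.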

 

Community Spotlight: Grace Institute

Introduction

Grace Institute is a non-profit organization that has provided job readiness training to women for more than one hundred years. Originally a school run by nuns, the agency has since transitioned into a workforce development program geared towards the current job market. The program is tuition free but requires a 20-week commitment, Monday through Friday from 9:00am to 4:00pm.

Participants receive training in 4 core components and spend the remaining time interacting with industry professionals who volunteer their time. The women in this program are from all walks of life – some held full-time jobs for 30 years and were suddenly laid off with no plans for the future. Some of the participants are underemployed and want to find a career, rather than a job.

It’s important to note that Grace Institute focuses on job and career fit over placement. As such, they teach participants self-evaluation and reflection. Volunteers from partner corporations come in to speak to them, both to give them an idea of where their skills can take them and to open up new thoughts on career paths. The institute also places strong emphasis on adult learning strategy, structuring the environment and classrooms to incorporate collaborative learning, open dialogue and critical thinking.

Agency Overview

Grace Institute was founded by W.R. Grace, an Irish immigrant, in 1897. When he began running his business, he employed many immigrants. The wives, sisters and daughters of these employees generally lacked the skills to compete in the job market. The institute was created to provide workplace skills for these women. Originally run by nuns, the organization has always been geared towards practical skills – in the early 1900s, this was sewing, cooking and other domestic tasks. As the marketplace changed, so did the course offerings.

Grace institute is focused first and foremost on providing immediately applicable job skills to women in need. The courses cover both hard and soft skills with the intention of helping women find long term and fulfilling careers.

Formerly a school, the agency has made a conscious shift towards being recognized as a workforce development program. As part of this shift, the attendees are known as participants instead of students. The program itself is run as a business casual corporate environment – from the clothing, to the settings to the fact that the women clock in every morning, all in preparation for working in a professional setting. One of the participants described it as a “completely holistic approach”.

Changing with the Times

Up until this year, the 136 participants were split across 4 sections, all receiving the following 4 classes: Business Writing and Communication, Office Technology, Keyboarding, and Professional Development. The necessity and setup of each course is continually evaluated to determine whether the skills are still relevant in the current job market. At the moment, Grace Institute is implementing Salesforce to track job placement and retention among alumni. They intend to use that data to modify current courses or develop new ones. This year, the agency began a pilot program in one of the 4 sections to prepare selected participants for the role of Patient Services Representative (PSR) in the healthcare field. By evaluating the job market, the Directors determined that there was potential for better job placement by focusing training to meet the needs of this field. To join the PSR section, current participants must go through an application and interview process, much like when they were first admitted to the program.

The institute is tuition free, save for a $75.00 registration fee and a $75.00 books/technology and materials fee. There is a strict admissions process to ensure that the women who are accepted will be committed to successfully completing the program. Grace Institute is a non-profit that does not receive any government funding. To remain tuition free, the Grace family accumulated a sizable endowment, now approximately $30 million. Until 2012, the organization was pulling 100% of its operating costs from this endowment. Since then, they have hired Jessica James, Director of Development, to cultivate relationships with businesses and procure grants from organizations such as The Robin Hood Foundation, The Pinkerton Foundation and The Blackstone Group. Additional funding and support comes from corporate partners. These businesses provide volunteers who come in to teach business skills or discuss their paths to success. They’ve even had members from Google come in for a day-long team building exercise that taught IT skills to the participants. Recently, the agency received a $100,000.00 donation from American Security that financed a new computer lab, including approximately 100 computers. The hope in building these relationships is to raise donations, provide industry insight to participants and allow networking that may lead to employment opportunities. Last year, Macy’s placed 25% of the graduates in Merchandising roles.

Instructional Overview

The tour was led by the Director of Development, Jessica, and two participants on the path to graduation. These women, Tolou and Rose, were passionate about the program and effusive in sharing the impact it has had on their lives. There were also two representatives from another organization, ParentJob.net, who were touring Grace Institute in hopes of establishing a partnership.

Walking into the classrooms, it was easy to see that these are all mature, professional and dedicated women. Even without an instructor, they were all working on something, no one was on her phone, and they were immediately engaged when we walked into the room. Rose and Tolou were eager to talk about the projects they’ve been working on, and there were examples of collaborative learning projects on every wall.

The women in this program generally range from 18 to 65 in age. Previously, one of the 4 sections was comprised entirely of young adults, who make up about 30% of attendees. With the addition of the PSR section, that group has been integrated into the other sections, increasing the diversity of age across the remaining groups. Regardless of section, all participants take the same core classes together. Age, race, language and past work experience vary greatly and have very little bearing on placement. There are lawyers, teachers, stay-at-home moms, and accountants. Some of the participants are fresh out of high school. Some, like Rose, had been employed for 26 years but were unhappy; some, like Tolou, had been out of the job market for over 10 years and found that their skills were sorely outdated.

While we stopped into a few different classrooms, only one of them was in session and we were unable to speak to the instructors. We observed the first day of the PSR course, which was an overview of the field and the position. Once completed, PSR-track participants should be fully equipped to provide customer care in settings such as hospitals, urgent care centers, and medical offices. Grace Institute is partnered with Weill Cornell, LIU, and New York Presbyterian and has recently partnered with CityMD for job placement.

Based on the conversations prior to the classroom tours, I was surprised to see the instructor using a PowerPoint presentation to deliver initial information. The first slide we were shown was a diagram of all the tasks associated with someone in a PSR role. She briefly went over the content while participants took notes, but there wasn’t any open dialogue or question-and-response that would get participants actively involved. Although she related the information to herself by saying, “At my core, I am a PSR and I have PSR duties”, she didn’t elaborate further or provide any personal details about the job she does. However, we only observed the lesson for a few minutes, so I cannot fully evaluate the facilitation method.

The course that I found most interesting, and that the participants said they found most valuable, was the Professional Development course. Tolou and Rose spoke passionately about how it affected their way of perceiving themselves, others and the world around them. It’s unfortunate that we weren’t able to observe it, as I think it would provide a better representation of the agency’s prescribed method of teaching adult learners. Below, I briefly detail some of the more interesting aspects of two of the courses and how they are taught.

  • Keyboarding – This class is taken in a group setting, but it is differentiated per learner. A typing test is given at the beginning of the program to determine the participant’s average typing speed (WPM). A goal is then set to add 15 WPM to that initial number, i.e. if you began at 40 WPM, by the end of the program you should be at 55 WPM. Participants use software that gauges where they are and then helps them to reach their goal. This allows participants to go at their own pace, eliminating frustration and undue competition.
  • Professional Development – This class covers many soft skills needed in the workplace but rarely discussed: etiquette, verbal and non-verbal communication, self-confidence, and cultivating your business persona. To prepare for job searching at the end of the program, participants engage in mock interviews with 45 volunteers from different companies, who provide them with valuable feedback.
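The Keyboarding goal above is simple arithmetic, and a quick sketch makes the differentiated targets concrete (the function names here are my own invention, not part of any software the institute actually uses):

```python
def typing_goal(initial_wpm: int, improvement: int = 15) -> int:
    """Each participant's target is her own starting speed plus a
    fixed improvement (15 WPM in the program described above)."""
    return initial_wpm + improvement


def goal_reached(initial_wpm: int, current_wpm: int) -> bool:
    """Has a participant hit her personalized target yet?"""
    return current_wpm >= typing_goal(initial_wpm)


# A participant who starts at 40 WPM works toward 55 WPM;
# one who starts at 70 WPM works toward 85 WPM.
print(typing_goal(40))       # 55
print(goal_reached(70, 82))  # False (still short of 85)
```

Because the target scales with each learner’s starting point, a beginner and an experienced typist face the same relative challenge, which is exactly what removes the competition between them.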

In terms of participant evaluations, every class provides tests and at least 2 progress reports. Technology courses also use ProveIt.com to assess progress. There are two full-time social workers on staff to provide counseling to participants who are struggling either in the program or in their personal lives. If it is determined that a participant won’t be able to complete the program, she is counseled out. The average program completion rate is 75%, with last year exceeding that at 87%. Their processes seem to work – 80% of their graduates are placed within 1 year.

Closing Thoughts

This visit left me extremely impressed with Grace Institute and what they have accomplished. They have been in operation for over 100 years and they have evolved to continue their mission in an ever-changing world. It isn’t easy to balance providing a much-needed service to communities with operating as a business, but they do it well. I think this starts with their mission as a workforce development program and their admissions process. From the very beginning, participants are given an orientation. If they apply, they can expect to be interviewed, take aptitude exams, pass the TABE test and be evaluated for risk factors to determine if this is the best time and/or fit for them. If an applicant is not admitted, she is referred to other agencies that provide similar services. While they can’t accept everyone, they provide motivation and resources to anyone who comes to their doors.

The other takeaway for me was the level of support provided by the staff and fellow participants. I have been very interested in building positive and open learning environments, and that feeling is prevalent at Grace Institute. There’s a floor-to-ceiling whiteboard in the halls where participants write motivating messages to each other. Every Friday they have something called Friday Forum, where they watch TED Talks, discuss their growth and ask questions. Every Tuesday and Thursday, they have something called ‘Food for Thought’, wherein a corporate volunteer visits to give insight into his or her field. There are open computer labs, free periods to catch up on work, and a community center to catch up with fellow participants. The women here are treated like adults in a working environment and, regardless of their backgrounds, their experiences and opinions hold value and are respected by everyone in the organization. So many of the things I witnessed have already gone into my toolbox, and the visit has introduced me to a completely new way to approach workforce development.

Exploring Diversity to Engage in Meaningful Conversation

Diversity is more than a word; it’s a conversation. It requires definition, context, perspective. It has been applied to groups, to initiatives, to companies, to food, to ecosystems. Diversity can and does mean so many different things that in this social media age, it’s all but a prerequisite to unpack the term before you can ever begin talking about it.

I took a multiculturalism and diversity course while completing my graduate degree. The first assignment consisted of completing a 2-sided diversity wheel that asked about you as a person and you as a construct. I know, I know, but stick with me. We were paired randomly with a partner and then asked to share as much as we felt comfortable sharing. At the end, we wrote brief introductions of our partners and shared them with the rest of the class.

I learned a few things about myself, things that I’ll likely discuss in another in-depth post. More importantly, I learned a great deal about the assumptions that are made when talking to other human beings. For example, by some accounts, our species is all we have in common. By others, looking alike is all it takes to be alike.

As a member of my company’s ERG program, I was finding it difficult to get real, deep, and frankly uncomfortable conversations going. We were going in circles, talking about ‘us’ as women, as one homogeneous group that shares the same experiences. To some degree, that’s the truth. I’ve been cat-called, I’ve been judged for my gender, I’ve been challenged because of it. I also know, however, that my experience as a woman of color, as a person who is on a different education or career track, as someone who grew up in NYC, means that I don’t necessarily share the same perspective as the woman sitting next to me. This doesn’t make one of us better than the other, but ignoring these differences can divide us.

Below is a workshop I delivered in a small group setting. I’ve expanded the initial exercise to open up conversations about what diversity and culture are, as well as to have participants identify their own knowledge gaps and look for ways to further explore them.

Telling Ain’t Training – Pt I: Understanding Your Learner

What’s Your Approach?

When designing trainings, how often have you considered the learner? And in what capacity? Do you think about your delivery method? What about the classroom environment? A dozen things might go through your mind as you mark off the unconscious checklist, but let’s take a moment to think a little differently about what training means and what it should accomplish.

Telling Ain’t Training starts with a few key points centered around understanding your learners before they even set foot in your classroom, chief among them the tenet that we should be building trainings for the needs of the learners, investigating their roles, responsibilities and prior experience in order to build something meaningful and relevant for them.

What Do You Want to Accomplish?

According to the authors, what we do falls into three categories:

  • Training – Is the goal to teach participants how to complete a step-by-step task?
  • Instruction – Is the goal to teach participants how to react in a situation with one or more variables?
  • Education – A culmination of life experiences and learning principles that go beyond reproducing or inferring; the road to expertise.
The purpose of training, instruction and education is to transform the learner, not transmit data. We want the learner to be able to apply what has been communicated and not just repeat it back.

Find Your Center of Focus

There’s a mantra repeated at the beginning of the book: educators must be “learner centered, performance based.” This encompasses not just your delivery but the content you build, where you build it, and how you interact with participants. Lose sight of this and you risk losing your credibility and your learners’ interest and respect.

Learner Centered Means Adapting

How we learn is part of our genetic makeup. Gardner’s Theory of Multiple Intelligences tells us that we need to engage different senses and learning types in order to really make teachings stick. The question is: how do you cater to an audience you’ve never met? Training can take advantage of what we know about the human brain to build flexible courses that can be modified on the fly or that already integrate best practices for engaging many types of learners.

Think About It!

Humans can store massive amounts of data. The issue lies in retrieving it. Assuming that it’s relevant to the learner, organization is the key. Consider the acronym PEMDAS and the mnemonic My Very Educated Mother Just Served Us Nine Pizzas. Can you remember what they mean? If the answer is yes, when was the last time you needed to use the information? Chances are you haven’t consciously thought about either in a long time, but the information still lives on. That’s the power of organization coupled with effective teaching and the human brain.

This, of course, doesn’t mean the classes where we learned these pieces of information were successful, but rather that someone stumbled upon a great memory technique that may or may not have translated into other parts of the curriculum. I can’t readily recall most of what I learned in Earth Science, but I vividly remember Algebra. The teacher included hands-on and group activities, employed a reward system and used visuals and audio cues to draw connections between prior knowledge and newer, more complex pieces of information.

Click here for the next post in this series where we’ll find out why trainings fail.