Eval13: 3000+ evaluators, 800+ sessions, and way too little time to soak it all in

Last week, more than 3000 evaluators descended on my hometown of Washington, DC for the American Evaluation Association’s annual conference. I learned this much + slept this much = rockstar conference.

#omgMQP

I had the pleasure of spending Monday and Tuesday in Michael Quinn Patton’s Developmental Evaluation workshop. Due 10% to my bad vision and 90% to being starstruck, I sought out front-row seats:

Along with many other nuggets of gold, MQP shared the Mountain of Accountability, a simple visualization demonstrating a Maslow’s hierarchy for organizations. (Start with the basics like auditing, personnel review, and outputs; then progress to typical program evaluation; then progress to developmental evaluation and strategic learning.) This visual was a fan favorite; iPads and iPhones were flying around as everyone tried to snap a picture. Anyone else think that MQP would be a great addition to the dataviz TIG?

My biggest takeaway? Developmental evaluation is probably the future of evaluation, or at least the future of my evaluation career. Also, many evaluators wouldn’t call this approach “evaluation,” which means I’m not an evaluator, but an “evaluation facilitator.” Time to order new business cards!

#thumbsupviz

On Tuesday night I had Dataviz Drinks with Stephanie Evergreen, Tania Jarosewich, Andy Kirk, Johanna Morariu, Jon Schwabish, and Robert Simmon, along with a few more poor souls who had to listen to our endless enthusiasm about charts, fonts, and other things “worth staying up late for.” We’ve each been trying to reshape the dataviz community from one of frequent criticism to one of encouragement and peer learning (e.g., the Dataviz Hall of Fame). A few beers later, the #thumbsupviz hashtag was born.

Stay tuned for our growing gallery of superb visualizations at thumbsupviz.com.

omg Factor Analysis…

On Wednesday I attended a pre-conference workshop about factor analysis. I learned the approach in grad school a few years ago, have only used it twice, and wanted to brush up my skills. The instructor provided a wealth of resources:

My biggest takeaway? Ouch. My brain was hurting. Leave the factor analysis to the experts because 99% of us are doing it wrong anyway. You don’t have to tell me twice!

Performance Management & Evaluation: Two Sides of the Same Coin

On Wednesday afternoon, I gave an Ignite presentation with my former supervisor and performance management expert, Isaac Castillo. Paired Ignites are rarely attempted, and I’m glad we took a risk. I had a lot of fun giving this talk. Stay tuned for future collaborations from Isaac and me!

Check out our slides and the recording of our presentation:

Excel Elbow Grease: How to Fool Excel into Making (Pretty Much) Any Chart You Want

On Thursday morning, I shared four strategies for making better evaluation charts: 1) adjusting default settings until your chart passes the Squint Test; 2) building two charts in one; 3) creating invisible bars; and 4) really really exploiting the default chart types, like using stacked bars to create a timeline or using a scatter plot to create a dot plot.

Here are the slides:

The section about dot plots was pretty popular, so I recorded it later:

I thought the presentation went okay, but afterwards, an audience member came up to me and asked, “So if I wanted to make a different type of chart in Excel, like anything besides a typical bar chart, how would I do it? What could I make?” “That’s what I just spent the last 45 minutes showing you.” “No I mean, if I wanted to make one of these in Excel, could I do it?” “Weren’t you in the audience for the presentation I just did?” “Yes, that would be a cool presentation, you should show us how to make those charts in Excel.” Thanks for the great idea buddy, I’ll submit that idea to next year’s conference. 🙂

East-coast happy hour

For the second year in a row, the east-coast AEA affiliates got together for a joint happy hour on Thursday night. Good vibes and familiar faces.


The Washington Evaluators, Baltimore Area Evaluators, New York City Consortium of Evaluators, and the Eastern Evaluation Research Society

The Conference is Over, Now What? Professional Development for Novice Evaluators

On Friday afternoon I led a roundtable with tips for novice evaluators. The discussion was awesome, especially the great chats I had with people afterwards. I’m going to write a full post recapping that session. Stay tuned!

How to Climb the R Learning Curve Without Falling Off the Cliff: Advice from Novice, Intermediate, and Advanced R Users

On Saturday morning I had the pleasure of presenting with a former teammate, Tony Fujs, and my new teammate, Will Fenn. Tony dazzled the audience with strategies for automating reports and charts with just a few lines of R code, and Will shared tips to help novices avoid falling off the learning curve cliff. Check out their resources and tips in this handout.


Tony Fujs (left) and Will Fenn (right)

I thought the presentation went okay, but afterwards, an audience member commented, “It would be really cool if you got some evaluators together to show us what kinds of things are possible in R.” “Umm yep, that’s what we just did, Will and Tony showed how to automate reports and create data visualizations in R.” “Yep exactly, that would be a great panel, you could get several evaluators together and show how to automate reports and make data visualizations in R.” “Did you see the panel we just did?” “Yeah you should put a panel together like that.” Okay thanks, I’ll consider it. 🙂

Evaluation Blogging: Improve Your Practice, Share Your Expertise, and Strengthen Your Network

Dozens of evaluators have influenced and guided my blogging journey, and I was fortunate to co-present with three of them on Saturday: Susan Kistler, Chris Lysy, and Sheila Robinson. I first started blogging after watching Chris’ Ignite presentation at Eval11, Susan’s initial encouragement kept me going, and Sheila provides a sounding board for my new ideas.


Left to right: Susan Kistler, Chris Lysy, and Sheila B. Robinson

Can you tell we presented on Saturday morning?! Chris and I arrived early. I almost panicked, but instead Chris and I started laughing hysterically, and then a second person arrived. Close call!


By the time we started, we had drawn a good crowd of 30-40 bloggers and soon-to-be bloggers. Same time next year??

Evaluation Practice in the Early 21st Century

Where have we come from, and where are we headed? Evaluators have accomplished some amazing things, and the future is bright. Patrick Germain and Michelle Portlock, evaluation directors at nonprofit organizations, shared strategies for making evaluation happen when you are not in the room:


For me, the mark of a good presentation is when the evaluator shows vs. tells us something new. Kim Sabo Flores, Chad Green, Robert Shumer, David White, Javier Valdes, and Manolya Tanyu talked about incorporating youth voices into policymaking decisions. The best part: the panelists invited a youth participant to speak alongside them on the panel so that she could share her experiences firsthand.


They taught us about youth presence vs. participation, and then they showed us about youth presence vs. participation. Well done!

A dataviz panel shared a brief history of dataviz; strategies for displaying qualitative data; and ideas for using graphic recording:

One of many, many graphic recording examples shared by Jara Dean-Coffey

The Innovation Network team is pretty fond of graphic recording too, and Kat Athanasiades even recorded an entire advocacy evaluation panel. Thanks to Cindy Banyai for capturing this awesome video!

And just in case you’re not familiar with my plans for our field…

Wave goodbye to the Dusty Shelf Report!

Lookin’ good, Eval! See you next year in Denver!

Conference Tips for Newbie Evaluators

Wear that nametag like a rockstar

Are you getting excited for the American Evaluation Association’s conference next week in Minneapolis?

This week I’m sharing conference tips for newbie evaluators and first-time conference attendees. Many of these ideas will be obvious to experienced evaluators, but they won’t be obvious to the coworkers and mentees who are attending one of their first conferences. Please consider passing these tips along, together with your own advice, to the newbie evaluators in your office.

Before the Conference

Here’s what newbie evaluators and first-time conference attendees should do before an evaluation conference:

  • Assess your motivations for attending the conference. Are you actively job-searching and want to conduct informational interviews during the conference? Are you considering starting your own consulting business and want to chat with independent consultants? Are you trying to meet others with similar interests so you can partner on future projects?
  • Start using Twitter and LinkedIn to meet people and figure out who you want to meet face-to-face at the conference.
  • Pack an extra casual outfit and an extra business outfit. Multi-day conferences tend to get more casual over time; that is, evaluators seem to wear suits the first day, then business casual for a few days, and nice jeans on the last day. Exception: If you’re presenting, save your favorite, nicest outfit for the day you’re presenting.
  • Don’t forget your electronics. It’s easy to buy a replacement toothbrush from the hotel’s convenience store, but nearly impossible to find a replacement charger for your laptop or phone.
  • Bring cash for taxis, tolls, tips, and parking meters. You’ll also need small bills ($1 and $5 bills) for group dinners. (Ever try to split a check between 20 evaluators with 20 different credit cards? It takes longer to pay the bill than eat dinner!) I usually pack $100 in small bills.
  • Find out what you’ll be reimbursed for. For meals: Does your organization use a per diem system or a reimbursement system? For alcohol: Alcohol’s typically not covered. For hotel internet access: Unless you’re working during the conference to meet a client deadline, your organization probably won’t pay that extra $10/day for internet. Instead, scope out the free wifi hotspots in the convention center or conference hotel. For airfare: Is there a limit on your airfare? Will your company reimburse you for baggage fees? For ground transportation: Will your organization reimburse you for taxis to/from the airport, or just shuttle buses? For your timesheet: Will your organization provide a charge code for the entire conference, or just the first few days? It’s not uncommon to work “on your own” on the last day of a conference, especially if it falls on a Saturday.
  • Find out which organizations will be represented at the conference. Even if you’re not job-hunting now, start brainstorming who you’ll want to work with in 5, 10, or 20 years. I like to skim through the back of the conference program.

During the Conference

Here’s what newbie evaluators and first-time conference attendees should do during an evaluation conference:

  • Iron your clothes and hang them in your closet as soon as you check in to your hotel. Multi-day conferences are exhausting, and you’ll need all the sleep you can get over the next few days. I can’t tell you how many times I’ve overslept and run to a session wearing a wrinkled pair of pants…
  • Keep your conference objectives in mind. Take charge and get what you came for! Networking won’t happen unless you initiate it.
  • Don’t work during the conference, even if it means working weekends beforehand to clear your schedule. If you’re presenting, don’t work on your slides during the conference (again, even if it means working weekends beforehand to finalize your presentation). Conference time is precious and is better spent learning and talking.
  • That being said… Make time for self-care. Have quiet time in your hotel room, go to the gym, or take a walk outside.
  • Skip a session or two for networking. My most valuable lessons and tips about evaluation have come from casual conversations with other evaluators, not from formal trainings.
  • Trade business cards and jot down notes about each person to guarantee that you’ll remember them a few years from now (e.g., “Knows my friend Joe; lives in Baltimore but visits DC often; schedule coffee together”).
  • Tweet! Use the conference’s official hashtag (#eval12 for the next American Evaluation Association conference and #eers13 for the next Eastern Evaluation Research Society conference).
  • Attend a Topical Interest Group (TIG) business meeting at the American Evaluation Association conference. This is a great way to meet current and up-and-coming leaders in the field.
  • Attend at least one session each on: 1) a brand new topic (to broaden your understanding of the field); 2) a beginner-level topic (to meet other newbies, and to feel more confident in what you already know); and 3) an advanced topic (to give yourself a reality check about what you don’t know).
  • Ask questions during sessions. Don’t cast doubt on the presenters’ project, methods, or findings, or you’ll get a reputation for being an annoying audience member. I recommend straightforward clarification questions: “This is really interesting information, thank you. I’m new to this concept/project/approach. Could you give us a little more background about xyz?”
  • Talk to presenters after their presentation. Don’t feel pressured to develop an elevator pitch or be an expert about their background or topic. Try this: “I learned a lot from your presentation, thank you. Where can I learn more about your work? Do you/your company have a website, blog, white papers, handouts, etc.? Do you ever visit DC? If so, would you like to have lunch the next time you’re in town?”
  • Don’t arrive to sessions late or leave early. It’s bad manners and distracting to the presenters.
  • Don’t worry about taking notes. Most handouts and slides will be available online right after the conference. My “notes” are my 140-character tweets.
  • Wear that nametag like a rockstar.
  • And if you’re job-hunting… Print 100 copies of your resume and post them to the jobs board. Don’t share your resume with people you’re meeting for the first time unless they ask for it first.

After the Conference

Here’s what newbie evaluators and first-time conference attendees should do after an evaluation conference:

  • Connect with everyone on Twitter and with a few people on LinkedIn.
  • Send personalized, casual emails to 5-10 people you met. I’ve actually received emails with citations – “Hi Ann, I liked hearing about your work, it coincides with Smith (2010), who found a statistically significant difference between x and y. That is, until Miller (2011) employed a larger sample and found xyz….” Please don’t send anyone a literature review! Instead, write something like this: “Hi Ann, I liked hearing about your work. I’m going to apply xyz skill in my next project. Let me know if you’re ever in DC. I’d enjoy learning more over lunch sometime.”
  • Schedule coffee, lunch, and happy hours with evaluators from your city.
  • Share what you learned! Write on your personal blog, your organization’s blog, and/or the aea365 blog. Tweet. Lead a brown bag for your teammates. Bonus points: Explain what you learned to a non-evaluator (like a parent, roommate or significant other) without putting them to sleep. Translating interesting ideas from evaluationese into lay language is a skill that improves with time.

Have additional tips and tricks to share with newbie evaluators and first-time conference attendees? Please share your ideas below.

Thanks, Ann Emery

 

Additional Resources:

My favorite resources for job-hunting evaluators

It’s that time of year again! Within the past week alone, three of my evaluation friends have asked for my advice in finding positions. Here are my favorite resources for job-hunting evaluators.
———————————————————————
Resource #1: The American Evaluation Association’s public job listings for evaluators. These postings are free to the public. Just visit www.eval.org, and then click on “Career” and “Search jobs/RFPs.”

Resource #2: The American Evaluation Association’s weekly career center email digest. The weekly email digest is one of many benefits of an annual membership. Membership is $80/year (and includes endless perks, like hundreds of live and pre-recorded webinars). To sign up for the email digest, visit www.eval.org and log in with your username and password. Click on “Career” and “Subscribe to Job Updates.” Then you’ll start receiving the weekly job digests by email.

Resource #3: The Eastern Evaluation Research Society’s public job postings on the website. The Eastern Evaluation Research Society is a regional affiliate of the American Evaluation Association for evaluators on the east coast. A majority of the positions are located between DC and NYC. Visit www.eers.org and click “Job Listings.”

Resource #4: The Eastern Evaluation Research Society’s job emails for conference attendees. We send 1-2 job announcements per month as a special perk for our conference presenters and attendees. Most, but not all, of the jobs are located along the east coast.

Resource #5: The Washington Evaluators public job postings and members-only mailing list. The Washington Evaluators is a local affiliate of the American Evaluation Association for evaluators in Washington, DC. Although we post a handful of jobs on our website at www.washeval.org, the way to learn about evaluation jobs in the DC metro area is to join our members-only mailing list. Members get 2-3 emails a week with job postings, RFPs, and upcoming brown bags. At just $25 a year, this resource is a steal.

Resource #6: The Evaluation Jobs group on LinkedIn. Although this group just started in March 2012, it already has 600+ members and dozens of jobs-only postings for evaluators. This is a great place to find jobs and to advertise openings (for free!) within your own team.

Resource #7: The Young Education Professionals of DC mailing list. Visit http://www.youngedprofessionals.org/yep-dc-get-involved.html and sign up for the Google mailing list. The YEP-DC mailing list is an undiscovered gem in DC! They send emails about jobs, brown bags, happy hours, conferences, and trainings related to education and youth development in Washington, DC. A majority of the job announcements are directly related to education (e.g., teachers and curriculum specialists within DC’s public, private, and charter schools), but a good number of jobs are related to data, research, and evaluation (e.g., a data coach or quality assurance manager within nearby school systems).

Resource #8: The Washington Post Jobs website. Visit http://www.washingtonpost.com/jobs/home. If you’re looking for an evaluation position within a large consulting firm, government contractor, or think tank around DC, then this is the place for you! Westat, ICF International, Urban Institute, and American Institutes for Research are some of the frequent advertisers. This is where I found my first evaluation job as an external evaluator in a consulting firm. However, a recruiter at my old job told me that they typically receive 400+ resumes for every entry-level position they advertise through Washington Post Jobs. Now’s a good time to check, double-check, and triple-check your resume for typos.

Resource #9: The Idealist website at www.idealist.org. If you’d like to be an evaluator within a nonprofit organization or school system, then this is the place for you! This is how I found my job as an internal evaluator in a youth center. However, you’ll have to get creative with search terms: when I typed “evaluation,” the results included advertisements for mentors and online engagement.

 

Do you have additional resources for job-hunting evaluators? Please share the good karma below.

— Ann Emery

P.S. If you’d like additional advice, check out my helpful hints for job-hunting evaluators.

 

 

Newbie Evaluator Essentials [Guest post by Karen Anderson]

Blog swap! Karen Anderson and I are mixing things up today by guest-posting on each other’s blogs.

Karen Anderson is my favorite “newbie” evaluator. Karen completed a master’s degree in social work a couple years ago. She’s currently an evaluator at a nonprofit in Atlanta and she’s the Diversity Programs Intern for the American Evaluation Association. In all her “spare” time, she’s doing pro-bono evaluation for the State of Black Gay America Summit organizers. And she’s a blogger!

Karen is on LinkedIn, and she writes the blog On Top of the Box Evaluation.

I hope you enjoy Karen’s guest post.

— Ann Emery

———-

When I think about the “newbie” evaluator (or the not-so-new professional in the evaluation field) and the knowledge base and skills needed not only to survive but to thrive, I reflect upon Jean King’s Essential Competencies for Program Evaluators.

King’s Essential Competencies for Program Evaluators include:

  1. Professional Practice: These are the fundamental norms and values of evaluation practice, which include working ethically, applying evaluation standards, and considering the public welfare; the last is explained further in the AEA Guiding Principles for Evaluators under Responsibilities for General and Public Welfare.
  2. Situational Analysis: The unique interests, issues, and contextual circumstances of evaluation.
  3. Reflective Practice: One’s own evaluation expertise and need for growth, which includes knowing self, reflecting on your practice, pursuing professional development, and building professional relationships.
  4. Interpersonal Competence: Do you have the people skills for evaluation practice? This includes negotiation skills, conflict resolution, cross-cultural competence, and facilitating constructive interpersonal interactions.
  5. Project Management: King describes this as the “nuts and bolts” of evaluation work in her presentation. This includes presenting work in a timely manner, budgeting, responding to RFPs, use of technology, and supervising and training others.
  6. Systematic Inquiry: The technical aspects of evaluation. What’s your knowledge base? Do you know qualitative, quantitative, and/or mixed methods? Developing program theory, evaluation design, and evaluation questions are also major components of this competency area.

So in terms of the Essential Competencies for Program Evaluators, how are you doing? I wish I had some type of rating scale to help me see how far I’ve come. I’d have to say that on-the-job training, webinars, and evaluation trainings (no matter how brief), like the American Evaluation Association’s online Coffee Break Demonstration series, have helped me come a long way since my grad school days (2010).

Newbies: What are some “essentials” that you think are missing from above that relate to your evaluation practice growth and development?

Not so newbies: What steps do you take to sharpen your evaluation skills and to increase your knowledge base?

— Karen Anderson

Is the evaluation hurting the program?

We were extremely fortunate to have past and present Presidents of the American Evaluation Association as our guest speakers at the 2012 Eastern Evaluation Research Society conference: Eleanor Chelimsky, Jennifer Greene, and Rodney Hopson.

Even though the conference was a couple of weeks ago, I’m still thinking about one of Rodney Hopson’s comments. He mentioned that he sometimes wonders whether evaluators, and the evaluations themselves, are actually hurting a program rather than helping it.

I’ve certainly had similar experiences. Mostly, I’ve seen program staff get so excited about data that they want to collect more, and more, and then even more data. You can read about one of my experiences here.

This seems like a great idea at first. What’s the harm? More data is better, right?

But… a few months down the road, the program staff and I are swimming in more data than we can handle. And, we often have more data than we really need. After all, my goal as a utilization-focused evaluator is to collect information that will directly influence decisions about the program or the participants. Simple, quick, streamlined data can be more useful than complex, time-consuming data.

Have other evaluators felt like this? Have you ever questioned whether your involvement is hurting rather than helping?