With the calendar and the weather clearly set to January, spring seems a long way off, but it’s fun to think about. Longer days. Warmer weather. A time of renewal. And it’s a good time to try out new tech. You’re almost six months into the academic year, and you’ve gotten a good feel for what’s working and what’s not. You have a sense of which gaps need to be addressed next year. So let’s make intentional use of the spring months to run a proof of concept on any new technology before implementing it school- or district-wide in the 2019/2020 school year. Read on for some great tips on running an effective pilot, then see some examples of cost-effective classroom technology pilots.

If you have a few classroom tech ideas on your list of things-to-try, Macro Connect can help you formulate a pilot program that produces real, actionable findings for an evidence-based decision. Give us a call or email us for a free consultation on pilot design.

Spring Is the Time to Pilot

You don’t need 9 months of data to make a great decision. Usually 2 or 3 months will do if you’re purposeful in designing and implementing the pilot. But when to run it? Fall is full of urgent tasks like Count Day and getting the year off to a good start. Winter is cut into pieces by a long holiday break and sometimes a mid-winter break. That makes spring our favorite time to pilot. As we mentioned above, you’re in a great rhythm, you have a good feel for your current gaps, and teachers and administrators have had time to think about new ideas. So spring it is!

New Technology Piloting Key #1: Establish Metrics and Define Success

It is so tempting to jump right into a test, assuming you’ll know by observation whether or not it worked. Plus, every tool comes with a fancy reporting dashboard full of data points, right? They do, and that’s part of the problem. Vendors aim to impress with a lot of data and analytics, but the mountain of data shouldn’t distract educators from the metric they’re looking to move. For a pilot to be effective, take some time at the beginning to think about what’s most important to you. Is it reducing teacher prep time? Is it class participation? Is it academic performance in a subject area? The tool you’re piloting may not, and need not, include that metric in its own reporting suite. Some of the most important measurements exist outside the machine or software, so you’ll need to figure out how to track them yourself.

You’ll also need to decide on a reasonable expectation for proof of concept. Whitepapers and your friends at other districts can provide insight into what use cases would move the needle in your area of interest. If you don’t have time to implement the tech with complete fidelity, make sure your goals match the scale of your rollout. Relying on quantitative data is key, because people can form strong, subjective, emotionally based opinions that skew results for or against adoption. Keeping your decision-making grounded in facts will help you defend your position to the Superintendent, the Board, and/or parents, and, most importantly, help you replicate results down the road.
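
To make that concrete, here’s a minimal sketch in Python of a fact-based success check, assuming you agreed on a primary metric and a target before the pilot started. The metric name, numbers, and 10% threshold are hypothetical placeholders, not recommendations.

    # Hypothetical success criterion agreed on BEFORE the pilot begins:
    # at least a 10% relative improvement in the primary metric.
    SUCCESS_THRESHOLD = 0.10

    baseline = {"class_participation_rate": 0.62}  # measured before the pilot
    pilot = {"class_participation_rate": 0.71}     # measured during the pilot

    def relative_improvement(before: float, after: float) -> float:
        """Fractional change from the baseline value."""
        return (after - before) / before

    for metric, before in baseline.items():
        change = relative_improvement(before, pilot[metric])
        verdict = "meets goal" if change >= SUCCESS_THRESHOLD else "falls short"
        print(f"{metric}: {change:+.1%} ({verdict})")

The code itself is trivial; the point is that the threshold is written down before anyone sees the results, so the verdict can’t drift with opinions.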

New Technology Piloting Key #2: Establish Pilot Scale & Length

You could be 3 months into something that you think is going well, but others don’t, and find yourself extending it in hopes that people will come around. Set a firm start and end date to promote accountability for completing agreed-upon milestones. A firm timeline also helps you resist the temptation of a loose, flexible pilot that lets additional variables affect the results, prolongs the decision-making process, and makes it harder to justify your final conclusions.

Before you begin, decide how many classrooms or schools will be part of this test. This helps you understand how long it will take to get the hardware or software installed and the users trained. The bigger the scale, the longer it will take. But too small a scale and you won’t have representative results. Choose a diverse range of users and environments that represent your student and staff body as a whole and you’ll set yourself up for an effective pilot.

You will also need to choose a pilot length that lends validity to the results. Going back to our time-of-year discussion, the Macro Connect team recommends 2-3 months, ideally in the spring, such as March through May. That way you’re finished before the chaotic final weeks of the school year. Establish and commit to the timeline for implementation, operation, and results gathering, and then set things in motion.

New Technology Piloting Key #3: Select & Train Key Users

User selection deserves a closer look. The teachers and/or administrators who take part in your pilot are crucial to its effectiveness. Avoid bias where you can and pick a diverse, representative sample. Remember that the loudest critics (or those calling for a new tool or a change) may be the users most susceptible to choosing emotionally. Those who are more tech savvy will know how to push the system to see what it can do and where it might fall short, but they may not be representative of your teaching population. Those who are less tech savvy will give you an idea of adoption issues and the training required, but they may also prevent you from gathering enough data during the pilot.
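
As one illustration, here’s a minimal sketch of a stratified draw from a teacher roster, assuming each teacher is tagged with a grade band and a self-reported comfort level with technology. The names, categories, and group sizes are all hypothetical.

    import random

    # Hypothetical roster: each teacher is tagged with a grade band and a
    # self-reported comfort level with technology.
    teachers = [
        {"name": "A. Rivera", "grade_band": "K-2", "tech_comfort": "high"},
        {"name": "B. Chen", "grade_band": "K-2", "tech_comfort": "low"},
        {"name": "C. Okafor", "grade_band": "3-5", "tech_comfort": "high"},
        {"name": "D. Patel", "grade_band": "3-5", "tech_comfort": "low"},
        {"name": "E. Novak", "grade_band": "6-8", "tech_comfort": "high"},
        {"name": "F. Gomez", "grade_band": "6-8", "tech_comfort": "low"},
    ]

    def stratified_pick(roster, per_stratum=1, seed=42):
        """Draw a fixed number of teachers from each grade-band/comfort group."""
        random.seed(seed)
        strata = {}
        for t in roster:
            strata.setdefault((t["grade_band"], t["tech_comfort"]), []).append(t)
        return [t for group in strata.values()
                for t in random.sample(group, min(per_stratum, len(group)))]

    for teacher in stratified_pick(teachers):
        print(teacher["name"], teacher["grade_band"], teacher["tech_comfort"])

Drawing from each stratum keeps both the tech-savvy and the less tech-savvy represented, rather than letting enthusiastic volunteers self-select into the pilot.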

Once you’ve found a group that captures a good mix of backgrounds, put every user in a position to succeed. Provide proper training so they know what they’re doing from the get-go, and share your vision for how the tool will be used down the line. You want this pilot to simulate a full-scale rollout as closely as possible. Help your users get in the right mindset, and remind them that this is, after all, just a test. Those who struggle with adopting new technology might give up on it or go into it with a bad attitude. Help them develop the positive perspective that will produce real, actionable results.

New Technology Piloting Key #4: Implement to Fidelity

It’s hard to stress this part enough. You’ve done the work to establish metrics, define success, set the scope, and train the users. Now you need to implement faithfully. Follow your pilot plan and don’t waver. Just as a neurobiologist conducts an experiment according to predefined variables and conditions, so must you. Alter it halfway through and you’ll invalidate your results. Give it a half-hearted effort and you’ll throw away weeks of prep work, months of piloting, and any financial investment. Once it’s all over, following the plan is what lets you make a reliable decision one way or the other. In areas where you don’t stick to the plan, document what happened and why so you can take those factors into consideration when evaluating the pilot.
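
For the documentation piece, even a simple append-only log is enough. Here’s a minimal sketch that writes dated deviation-from-plan entries to a hypothetical CSV file; the file name and example entry are placeholders.

    import csv
    from datetime import date

    def log_deviation(path: str, what: str, why: str) -> None:
        """Append one dated deviation-from-plan entry to a CSV log."""
        with open(path, "a", newline="") as f:
            csv.writer(f).writerow([date.today().isoformat(), what, why])

    # Hypothetical example entry:
    log_deviation("pilot_deviations.csv",
                  "Skipped week-6 training session",
                  "Snow day closed the building")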

New Technology Piloting Key #5: Gather & Evaluate Feedback

Once the pilot is over, collect the data tied to your success metrics and see whether it’s a go or a no-go. If the outcome was mixed, dig into additional data, including qualitative feedback or other performance indicators that may have moved, to make an informed decision. Key questions to consider in a mixed outcome: What variables could have affected the validity of the trial? Were some users obstinately opposed? Were some biased in favor? Did something unexpected change over the course of the pilot and throw things into disarray? Only you will know the answers, but they’re crucial to consider.
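
When the numbers are mixed, a quick tally of post-pilot survey responses can help answer those questions. Here’s a minimal sketch; the responses and field names are hypothetical.

    from collections import Counter

    # Hypothetical post-pilot survey responses.
    responses = [
        {"user": "T1", "recommend": "yes", "comment": "Saved me prep time."},
        {"user": "T2", "recommend": "no", "comment": "Logins failed weekly."},
        {"user": "T3", "recommend": "yes", "comment": "Students seemed more engaged."},
        {"user": "T4", "recommend": "unsure", "comment": "Needed more training."},
    ]

    # Overall sentiment at a glance, e.g. Counter({'yes': 2, 'no': 1, 'unsure': 1})
    print(Counter(r["recommend"] for r in responses))

    # Surface the non-positive comments to check whether opposition reflects
    # the tool itself or an implementation issue (training, setup, timing).
    for r in responses:
        if r["recommend"] != "yes":
            print(f'{r["user"]}: {r["comment"]}')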

Misconceptions in Piloting EdTech

Macro Connect has run many small and large pilots for schools and districts of all sizes. Don’t stand in your own way when it comes to running a proof-of-concept trial. The worst-case scenario is going straight from demo to implementation without validating the long-term efficacy of a major technology investment.

  • Misconception #1: Classroom hardware is too large to simply try out

Often districts put out RFPs and make major investments in interactive projectors, classroom sound systems, and the like, without ever having their teachers teach with the technology! The reality is that most manufacturers (at least the good ones) will work with you to implement a demo classroom or design a small-scale rollout. For example, we are working alongside Audio Enhancement and Plymouth Educational Center on a 5-room implementation of sound reinforcement. This test of classroom audio will give the district the data it needs to decide whether classroom audio belongs in its long-term vision for instruction and its budget.

  • Misconception #2: My district is too big to rely on just a handful of users

Last spring, Macro Connect assisted Walled Lake Consolidated Schools in testing ClassLink’s Single Sign-On and technology usage analytics platform. The district handpicked its Instructional Technology support team, which had representation from each campus across the district, to provide feedback.

  • Misconception #3: Server-side technology is too complex for pilots

Similar to misconception #1, the best vendors are willing to prove their worth by implementing their products right alongside their competitors. For example, SonicWALL will put an appliance next to your existing firewall to provide you with reporting on known cybersecurity threats over a 30-day period. Similarly, on the content side, SAFARI Montage will install a demo server at your head-end and let you use their content free of charge for 60 days.

Get Started Now

It may not seem like it, but spring will be here before you know it. Now is the time to start developing a pilot program so that you can implement it by early to mid-March. If you’d like some help designing a pilot or working with a technology provider to initiate a pilot, give us a call. Or if you just want to bounce a couple ideas around, we can help with that too. Invest a little time now so you can comfortably invest more in the future!