Data and Decisions: Driving Change in South Africa’s School Districts

In this blog series, we discuss the importance of data in South Africa’s education system – why it matters to parents, learners, and educators, and how providing educators with the information they need can help learners excel. In this post, we share the lessons we have learned implementing data-driven programs in South African schools. Click here to read all the blogs in this series.

Ready for a pop quiz? Here are three statements – two true and one false – about the availability and use of education data in South Africa. Can you spot the false one?

  1. Over 20,000 schools in South Africa have access to detailed data on their students’ learning and needs.
  2. The data was organized and submitted by the schools themselves – representing information for nearly all learners in South Africa.
  3. Valuable performance data is used to smoothly develop and implement data-driven programs across South Africa that improve student learning by focusing on the areas where they need the most help.

If you answered 3, you have a great sense of where our work is currently focused in South African school districts. Thanks to a coordinated effort by the Department of Basic Education and school districts, many schools now have access to the data they need to create highly impactful programs for their students – but successful implementation is limited to small pockets of activity dotted throughout the country.

Over the past five years, we have learned a lot about what makes the implementation of a data-driven program work – and what doesn’t. One of our Dell Social Impact Principles is: If it doesn’t work, tell everyone. Here are our lessons learned for implementing a successful data-driven education program:

  1. Don’t go shallow. Data dashboards (check out a demo version here) give educators a view of a variety of figures so they can make the right programmatic decisions for their districts, schools, and students. But unless we work closely with stakeholders during implementation, we cannot know how effectively a dashboard actually supports their decision-making. We learned that we must dive deep into stakeholders’ individual processes to understand their specific use cases and enable them to improve learning outcomes.
  2. One size almost never fits all. Once the dashboard prototype was developed, we were ready to see how schools and Education Management Information System (EMIS) departments would develop the programs they needed for their learners. We hoped that our broad solution could be easily adopted and implemented. But the very nature of a data-driven dashboard – and consequently a data-driven program – requires good information technology infrastructure, which is not necessarily available across EMIS departments in South Africa. To enable scale and local adoption of the dashboards, we had to set aside our vision of efficient and speedy rollout, work with the people using the dashboards, and deliver solutions that fit their technology realities on the ground.
  3. Quality, then quantity. Even the best data solution will not get traction if the underlying data is not credible. Collecting quality data is a monumental task with few quick fixes. For us, early data collection was slow, manual, expensive, and missing key data fields. This forced the program to develop a data validation and extraction tool that was deployed to fix quality at the source. Although not a quick win, this work led to unprecedented gains in the quality, reach, and timeliness of data collection.
  4. If you build it, they will come – maybe. Just because the dashboards made quality data available doesn’t mean that educators automatically started to use it. As with anything new, we had to actively drive usage of the tools to see how our work catalyzed the behavioral changes necessary for impact. We created a change management strategy, provided training and coaching for users in data analysis, and established good product management practices. All these efforts aim to ensure the data is accurate, timely, and relevant – and that education officials can easily identify schools, subjects, or learners that require additional support or targeted intervention programs.
  5. Measure mindfully. This is another favorite in our Dell Social Impact Principles because evaluation is at the heart of everything we do. We are currently working to help districts use their dashboard data to monitor and evaluate the impact of their work. With knowledge of what works – substantiated by data – education officials can adopt the interventions with the greatest potential to improve learner outcomes.

What’s next?

How do we move from increasing and enabling data access to driving more widespread usage? In the next blog in this series, we’ll explore the innovative ways education officials are becoming increasingly data-driven to improve education for all young students in South Africa.