The Youth Development Agency
  • In search of the ethical algorithm

    Let me take you to the future. No, unfortunately I don’t have Doc Brown’s time machine, but I would like you to join me on a thought experiment into our future. This is not a speculative future but one that will be upon us very soon, and one that we need to think about now, particularly as educators.

    So, with the flux capacitor warming up, let us begin. The year is 2030: Elon Musk is electrifying all possible modes of transport, Richard Branson is selling trips to the moon, we have given up the internal combustion engine and we are travelling to work in automated pods that carry us swiftly and silently to our places of work. So far, so Jetsons. Whilst we are in our pod, sipping our coffee and reading the newspaper (it’s still a thing) on our tablet, a pedestrian strays in front of us. The pod is in fully automated mode and there is no time to avoid a collision. If the pod swerves it will hit a wall, which in all likelihood will not end well for the occupant; if it continues it will hit the pedestrian, with equally dire consequences.

    So what does it do, this automated, algorithm-driven machine? How does it make a decision?

    Does it save the occupant? She owns it; it’s licensed to her. Does it have an obligation to protect her?

    Does it, in the split second it has, consider the effects of braking, including the rate of deceleration and the subsequent force of impact on the pedestrian, calculate the odds of survival for each individual, and favour the one most likely to survive?

    Does it use its internal face-recognition software to determine the age of the pedestrian and use this as a factor in its decision-making?

    The answer to these questions, at least at the moment, is: who knows? We’re not there yet. But given the recent news regarding the Facebook bots that were able to develop their own language, it is not a question we can put off for too long, and this, at least in my opinion, is why we need to be having these conversations with governments and educational institutions now.

    Behind every automation, at some point, is a programmer, someone who has set the wheels in motion. They have designed the lines of code that will interpret, in our example above, the speed of the car, its distance from other vehicles and the condition of the road. Armed with these sensor inputs, the algorithm makes decisions based on logic programmed by its maker.
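
    To make that concrete, here is a minimal, hypothetical sketch in Python of the kind of logic a programmer might write. Every name, number and input in it is invented for illustration; it is not how any real autonomous vehicle is programmed, but it shows where a human choice gets baked into the code long before the pedestrian steps out.

    from dataclasses import dataclass

    @dataclass
    class SensorReading:
        speed_ms: float                # vehicle speed in metres per second
        distance_to_obstacle_m: float  # distance to the detected pedestrian in metres
        road_friction: float           # 0.0 (sheet ice) to 1.0 (dry tarmac)

    def choose_action(reading: SensorReading) -> str:
        """Decide what the pod does, based on whether it can stop in time."""
        # Rough braking distance from the standard physics estimate
        # v^2 / (2 * mu * g); a simplification, not a production model.
        g = 9.81
        braking_distance = reading.speed_ms ** 2 / (2 * reading.road_friction * g)

        if braking_distance < reading.distance_to_obstacle_m:
            return "brake"  # the pod can stop safely; no dilemma arises
        # This is the ethical gap: the function must still return
        # *something* when no safe option exists, and that default
        # was chosen in advance by a person.
        return "brake_and_swerve"  # one possible, and contestable, default

    print(choose_action(SensorReading(speed_ms=13.9,          # roughly 50 km/h
                                      distance_to_obstacle_m=12.0,
                                      road_friction=0.7)))    # prints "brake_and_swerve"

    The physics is not the point; the final return statement is. Whatever the code does when braking alone is not enough, someone decided it, in an office, long before the collision.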

    These algorithms can be things of real beauty, and programmers and designers are brilliantly creative people. With the pace of technological advance, it is clear that more and more people will be working in these industries in the future.

    As educators, one of our roles, I believe, is to try to prepare young people as best we can for the challenges they are likely to face in the world after they leave us. This raises the question: how can we prepare young people for a future that involves automation and a deeper interaction with algorithms?

    More importantly, as more and more of our students take up jobs programming algorithms, how can we provide them with the kinds of experiences that will help them navigate the ethical challenges this type of work will present?

    As experiential learning practitioners, we believe that providing young people with experiences that demand negotiation, teamwork, leadership, co-operation and problem solving is key to this problem, along with well-structured and supported reflective practice. These situations, properly developed and mediated, allow young people to develop risk management strategies, engage in debate and compromise, and, importantly for the question posed at the beginning of this article, develop logic and reason.

    It has been well documented by others that in the UK there is a need for more experiential learning, particularly around risk. This was highlighted by Amanda Spielman, Chief Inspector of Ofsted, writing in the Sunday Telegraph recently. Ms Spielman warns against children being wrapped up in cotton wool, as it is “limiting their opportunity to fully take advantage of the freedom of childhood, and to explore the world around them.” It is also limiting their ability to understand, and therefore manage, risk.

    With this limited and sanitised approach to risk management, how will this generation fully understand the consequences of the actions they take in these new digital roles?

    Creating these environments where children and young people can not only explore challenging activities but also reflect on their experiences with the support of trained practitioners is something I am passionate about offering and supporting.

    An algorithm-driven world is inevitable; how we approach it, manage it and live with it is not. As those responsible for equipping the next generation, we must work together now to ensure they are given the support and guidance they need.