via Future of Privacy Forum
What if your boss was an algorithm? Imagine a world in which artificial intelligence hasn’t come for your job – but that of your manager: whether it’s hiring new staff, managing a large workforce, or even selecting workers for redundancies, big data and sophisticated algorithms are increasingly taking over traditional management tasks. This is not a dystopian vision of the future. According to Professor Jeremias Adams-Prassl, algorithmic management is quickly becoming established in workplaces around the world.
Should we be worried? Last month’s A-level fiasco has shown the potential risks of blindly entrusting life-changing decisions to automation. And yet, the Oxford law professor suggests, we aren’t necessarily defenceless or impotent in the face of machines – and might even want to (cautiously) embrace this revolution. To work out how we should go about regulating AI at work, he has been awarded a prestigious €1.5 million grant by the European Research Council.
This will require a serious rethink of existing structures. Over the course of the next five years, Professor Adams-Prassl’s project will bring together an interdisciplinary team of computer scientists, lawyers, and sociologists to understand what happens when key decisions are no longer taken by your boss, but by an inscrutable algorithm.
Employers today can access a wide range of data about their workforce, from phone, email, and calendar logs to daily movements around the office – and your Fitbit. Even the infamous 19th-century management theorist Frederick Taylor could not have dreamt of this degree of monitoring. This trove of information is then processed by a series of algorithms, often relying on machine learning (or ‘artificial intelligence’) to sift data for patterns: what characteristics do current star performers have in common? And which applicants most closely match these profiles?
‘Management automation has been with us for a while’, notes the professor. ‘But what we’re seeing now is a step change: algorithms have long been deployed to manage workers in the gig economy, in warehouses, and similar settings. Today, they’re coming to workplaces across the spectrum, from hospitals and law firms to banks and even universities.’ The Covid-19 pandemic has provided a further boost, with traditional managers struggling to look after their teams. As a result, the algorithmic boss is not just watching us at work: it has come to our living rooms.
That’s not necessarily a bad thing: algorithms have successfully been deployed to catch out insider trading, or to help staff plan their careers and find redeployment opportunities in large organisations. At the same time, Professor Adams-Prassl cautions, we have to be careful about the unintended (yet often entirely predictable) negative side effects of entrusting key decisions to machine learning. Video-interviewing software has repeatedly been demonstrated to discriminate against applicants based on their skin tone, rather than their skills. And that sophisticated hiring algorithm may well spot the fact that a key pattern amongst your current crop of senior engineers is that they’re all men – and thus ‘learn’ to discard the CVs of promising female applicants. Simply excluding gender, race, or other characteristics won’t cure the problem of algorithmic discrimination, either: there are plenty of other data points, from shopping habits to postcodes, from which the same information can be inferred. Amidst a burgeoning literature exploring algorithmic fairness and transparency, however, the workplace seems to have received scant attention.
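The proxy problem described above can be illustrated with a deliberately tiny, hypothetical sketch: the applicant records, postcode groups, and hiring rates below are all invented for illustration. A scorer trained only on postcode – with gender never in the data – can still reproduce a gendered outcome, because postcode happens to correlate with gender in the historical records.

```python
# Toy applicant records: (postcode_group, gender, hired)
# Gender is never shown to the scorer; it is listed only so we can
# audit the outcome afterwards.
applicants = [
    ("A", "M", 1), ("A", "M", 1), ("A", "M", 1), ("A", "F", 1),
    ("B", "F", 0), ("B", "F", 0), ("B", "F", 0), ("B", "M", 0),
]

def hire_rate(rows):
    return sum(hired for *_, hired in rows) / len(rows)

# "Train" a naive scorer on postcode group alone.
rate_by_postcode = {}
for pc in {"A", "B"}:
    rate_by_postcode[pc] = hire_rate([r for r in applicants if r[0] == pc])

def score(postcode_group):
    return rate_by_postcode[postcode_group]

# Audit: the gender-blind scorer still favours male applicants,
# because postcode group is correlated with gender in the data.
male_scores = [score(pc) for pc, g, _ in applicants if g == "M"]
female_scores = [score(pc) for pc, g, _ in applicants if g == "F"]
print(sum(male_scores) / len(male_scores))      # 0.75
print(sum(female_scores) / len(female_scores))  # 0.25
```

Dropping the gender column changes nothing here: the bias lives in the correlation between the remaining features and the protected attribute, which is exactly why the article notes that excluding characteristics is not a cure.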
Existing legal frameworks, designed for the workplace of the last century, struggle to keep pace: they threaten to stifle innovation – or leave workers unprotected. The GDPR prevents some of the worst instances of people management (no automated sacking by email, as is the case in the US) – but it is nowhere near a fine-grained enough tool. Understanding the technology is key to solving this conundrum: what information is collected, and how is it processed?
‘There’s nothing inherently bad about the use of big data and AI at work: beware any Luddite fantasies’, the professor insists. But employers should tread carefully: ‘Yes, automating recruitment processes might save significant amounts of time, and if set up properly, could actively encourage hiring the best and most diverse candidates – but you also have to watch out: machine learning algorithms, by their very nature, tend to punish outliers.’
Backed by the recently awarded European Research Council (ERC) grant, his team will come up with a series of toolkits to regulate algorithmic management. The primary goal is to take account of all stakeholders, not least by promoting the importance of social dialogue in reshaping tomorrow’s workplace: the successful introduction of algorithmic management requires cooperation in working out how best to adapt software to individual circumstances, whether in deciding what data should be captured, or which parameters should be prioritised in the recruitment process.
It’s not simply a question of legal regulation: we need to look at the roles of software developers, managers, and workers. There’s little point in introducing ‘AI for AI’s sake’, investing in sophisticated software without a clear use case. Workers will understandably be concerned, and may seek to resist: from ripping out desk activity monitors to investing in clever Fitbit cradles which simulate your workout of choice.
‘There’s no such thing as the future of work’, concludes Professor Adams-Prassl. ‘When faced with the temptation of technological predeterminism, always remember to keep a strong sense of agency: there’s nothing inherent in tech development – it’s our choices today that will ensure that tomorrow’s workplace is innovative, fair, and transparent.’