AI has taken over the job market by reading resumes and watching interviews to provide human executives with the best candidates, a new book has revealed. 

The book, titled ‘The Algorithm,’ has pulled the curtain on how the hiring world is becoming a ‘Wild West’ where unregulated AI algorithms make decisions without human oversight.

Artificial intelligence decides who gets hired and who gets fired by monitoring everything from what people post on social media to their tone of voice in interviews, the book’s author, Hilke Schellmann, told DailyMail.com.

Algorithms can now dictate not only who gets job interviews – but, thanks to continuous on-the-job monitoring, who gets promoted or fired (and they might even warn your boss if you are getting divorced).

Schellmann said the CEO of ZipRecruiter told her a few years ago that the tech was screening at least 75 percent of resumes.

‘That was 2021; it’s probably 100 percent now. We know that 99 percent of Fortune 500 companies already use AI tools in hiring,’ she said.

Schellmann said it is now inevitable that if you apply for a job today, your resume will be screened by AI long before it reaches a human – and she offers tips on how to get noticed in an AI-driven workplace.

The AI tools recruiters use are unreliable, and even the recruiters themselves don’t know how they work, she explained, adding that the AI-enabled job application process is rife with discrimination and relies on ‘blatantly weird’ keywords.

The AI tools are often ‘black boxes’ where recruiters can’t see how they work.

The tech can develop unusual ideas about a candidate’s likelihood of success – predicting it, for instance, from church attendance or nationality – making the process riddled with discrimination.

That means, for example, that women or disabled people might find themselves discriminated against during the hiring process – but as people don’t know which AI tools have been used, it’s difficult for them to respond.

‘The vendors that build AI tools don’t want to be scrutinized and are reluctant to talk about any problems,’ said Schellmann.

‘They want to talk about it in glowing marketing terms, right? How wonderful it is to find the best people – but they don’t want to open the black box for testing or transparency.

‘The companies that use these AI tools also often don’t want to talk about it because they fear reputational harm from applicants being upset that AI is being used and that no humans are looking at their job applications.’

And she said that machines do the bulk of rejections.

One former employment lawyer, Matthew Scherer, whom Schellmann spoke to, said the tools used are ‘not ready for prime time.’

That is because the technology is ‘very basic’ and cannot fully predict real-world outcomes such as a person’s success at a job. 

Schellmann described many technologies used to sift resumes as ‘snake oil.’

‘We know it saves money. We know it saves labor, but we have not seen proof that it picks the most qualified candidates,’ she said.

Many organizations are also using AI to assess recordings of video interviews – looking for issues such as the ‘wrong tone of voice,’ said Schellmann.

‘Unfortunately, this is mostly legal,’ she continued.

‘The European Union is a little bit more strict with General Data Protection Regulation [GDPR] and other laws, but the United States is still the Wild West on this, except for some local laws that we see.

‘There’s one in Illinois where you must let people know that AI is being used in video interviews. But overall, there isn’t much regulation yet in this area.’

AI tools used in interviews pull out ‘biomarkers’ (such as tone of voice or movements) that supposedly correspond to emotions.

‘If you and I are talking, this tool can find out if you’re anxious or depressed based on either tone of voice or when we see the facial expression. Yeah, and the intonation of the voice,’ Schellmann said.

‘What does it mean, a facial expression in a job interview?

‘It doesn’t make you good or bad at a job. We are using these technological signals because we can – but they don’t often have a lot of meaning.’

Employers now also routinely scan social media networks such as X and LinkedIn using AI algorithms – looking for details such as references to songs with violent lyrics.

‘That could mean you are labeled as a violent person and someone that shouldn’t be hired,’ Schellmann said.

Many companies do this as part of a hiring screening stage – but others continuously use such scans on employees.

‘Some of these tools also find things like whether you’re prone to self-harm,’ revealed the author.  

‘In the United States, that could be illegal because you’re not allowed to ask people for medical conditions before they’re hired.

‘It’s also a question: why would a company want to know if you’re prone to self-harm? Like, is that actually helpful? Are they helping these people, their employees? Or are they punishing them?’

Companies use AI algorithms to assess people’s personality based on their social media posts, analyzing language to assess what people are ‘really’ like, Schellmann said.

‘Companies want to look under the hood of people, right? They want to know who you are before hiring you,’ Schellmann explained.

Schellmann said that the ability to ‘look under the hood’ of people is something that organizations have craved for decades – leading them to rely on untested or bogus technologies such as handwriting analysis.

The same job is done by AI algorithms poring over videos of job interviews.

Organizations that want to hire someone who is a ‘fast learner’ (so they can adapt to a changing technological world) often rely on such technologies to predict who might be a good fit, Schellmann said.

But relying on personality (and on untested AI algorithms to deliver people with that specific trait) is a mistake, Schellmann argues.

Schellmann said, ‘What we know from science is that personality is about five percent or so predictive of success in the job. So that’s very, very little.

‘We often overcome our personality. I’m quite shy, and I have to work on that. When I go to receptions and parties, I have to work on approaching strangers. We can overcome our personality at work and in other places.

‘It’s actually questionable, should we use it. But it’s easy to use. It’s super cheap. It’s just an easy way to do it, and that’s why they do.’

How to succeed when AI is reading your resume (and probably watching your job interview) 

Schellmann advises applicants to match 60 to 80 percent of the keywords from the job description (not 100 percent, because AI tools may weed you out for simply copying the job description).

Schellmann said, ‘You want to have a super basic resume. The old advice often was to stand out to a human with cool columns and graphics. Don’t have any graphics, like a machine cannot read that: no images, no columns.’

Instead, Schellmann advises applicants to use bullet points and clear, machine-readable language – short and concise.

She said several website services (including JobScan) can help you see if your resume or application is machine-readable.

You upload the job description and a resume to check the overlap.
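For readers curious what such an overlap check involves, here is a minimal sketch in Python of comparing the distinct keywords in a job description against those in a resume. It is an illustration only, not how JobScan or any vendor’s tool actually works, and the file names are placeholders.

```python
# Minimal, hypothetical sketch of a resume-vs-job-description keyword check.
# NOT how JobScan or any real screening tool works; real systems use far more
# elaborate parsing and scoring.
import re
from collections import Counter

STOPWORDS = {"and", "or", "the", "a", "an", "to", "of", "in", "for", "with", "on"}

def keywords(text: str) -> Counter:
    """Lowercase the text, keep word-like tokens, and drop common stopwords."""
    words = re.findall(r"[a-z][a-z+#.-]*", text.lower())
    return Counter(w for w in words if w not in STOPWORDS)

def overlap(resume: str, job_description: str) -> float:
    """Fraction of distinct job-description keywords that also appear in the resume."""
    resume_kw = keywords(resume)
    job_kw = keywords(job_description)
    if not job_kw:
        return 0.0
    matched = sum(1 for word in job_kw if word in resume_kw)
    return matched / len(job_kw)

if __name__ == "__main__":
    # Placeholder file names; aim for roughly 60-80 percent, per the advice above.
    score = overlap(open("resume.txt").read(), open("job_description.txt").read())
    print(f"Keyword overlap: {score:.0%}")
```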

The current job market is a ‘cat and mouse’ game where applicants often use AI large language models such as ChatGPT to write cover letters, Schellmann said.

Schellmann said, ‘Humans don’t actually craft the text, and the humans don’t evaluate the cover letters and resumes anymore. It’s the machines against machines.’

Schellmann advises using AI systems like ChatGPT to work out the questions you are most likely to be asked – and in non-live interviews, ChatGPT can also help you come up with answers.

In interviews, if you’re going to be evaluated by machines, give long answers describing specific scenes, Schellmann advised – because the shorter the answer, the harder it is for machines to understand.

Schellmann said some also suggest looking at the camera to show the algorithms you are ‘engaged.’

Schellmann said the other key thing is to apply to as many jobs as possible – even ones you feel unqualified for.

‘There’s a definite difference between women and men: women only apply when they are 100 percent qualified, while men apply when they are 50 percent qualified. But if machines are rating whether you are qualified or not – apply when you think you are 60 percent qualified.’

Schellmann said that the answer is to keep applying, even if you have to do it 150 or 200 times.

She said, ‘Don’t be discouraged. It’s like a numbers game, and it can be really frustrating for people. But it’s just machines who read it at the other end and put you on a yes or no pile. 

‘And there’s very little control anyone has over that. So bulk applying is the only way to do it.’

This post first appeared on Dailymail.co.uk
