Computer-aided personnel selection processes are now used by many companies.

They assess the personality of job applicants and are intended to reveal whether an applicant is willing and able to perform the work the future employer expects.

However, once an applicant has cleared the hiring hurdle, relying on digital tools as a basis for personnel decisions can backfire on employers.

This is shown by a recent judgment of the Hesse State Labor Court (LAG) of February 25, 2021 - 12 Sa 1435/19, which dealt with the validity of a flight attendant's dismissal.

The defendant airline had terminated the employment of a flight attendant after she had failed an IT-supported test procedure.

The test was intended to determine whether an employee was at risk of becoming a so-called insider threat.

It contained around 200 self-assessment questions that were evaluated automatically.

The test attributed a potential for radicalization to the flight attendant.

From the airline's point of view, she therefore posed a security risk.

The airline therefore dismissed her on grounds relating to her person.

Wrongly so, as the LAG determined.

In the course of the proceedings, the airline was unable to explain how the IT-supported test had reached this conclusion.

The court was therefore unable to independently review the assessment of the flight attendant's personality.

For the use of AI and other digital tools in the world of work, this means that employers must at least be able to understand for themselves how personnel decisions based on IT or AI came about.

Otherwise, they risk losing any legal dispute arising from such decisions.

Doris-Maria Schuster is a partner and Sven Ole Klingler is a research associate at the law firm Gleiss Lutz in Hamburg.