There's no doubt that artificial intelligence (AI) is making working life easier.

Until now, many employees have been using tools such as DeepL, Siri, Google Translate and the like.

So it is hardly surprising that companies and their staff are now turning to increasingly sophisticated AI tools, such as ChatGPT, to make work easier.

For example, a major international law firm recently announced an exclusive partnership with OpenAI, the company behind ChatGPT, for an application that saves lawyers many hours of work.

Some companies train their employees in how best to use these tools and provide proper guidance. Others prohibit their use and block the corresponding websites on their network. Most, however, have no guidelines or policies in place at all, which is a mistake.

The use of AI in general, and the introduction of tools like ChatGPT in particular, brings with it a host of legal issues that have yet to be fully resolved.

From the point of view of collective labor law, the question arises in particular as to whether an employee delegation has a say in the introduction of tools such as ChatGPT.

In Germany, an initial decision on this issue has just been handed down by a labor court.

The employer wanted to authorize its employees to use generative AI as a new work support tool. To this end, it published guidelines on its intranet concerning the use of AI-enabled IT tools in the workplace.

However, the employee representatives felt that the authorization to use ChatGPT, combined with the publication of the guidelines, seriously violated their co-determination rights.

In particular, they asked the employer to block ChatGPT and prohibit its use.

When the employer refused, the group works council applied for an interim injunction against the employer.

The labor court rejected the injunction, ruling that the works council's co-determination rights had not been violated.

In Luxembourg, the same question arises: must the introduction of an AI system such as ChatGPT be approved by the staff delegation in companies employing at least 150 employees, or does the delegation only need to be informed and/or consulted in advance?

As far as the delegation's right to participate in company decisions is concerned, the introduction or application of ChatGPT is not, in principle, intended to monitor the behavior or performance of employees at their workstations. As such, delegation approval would not be required.

However, if ChatGPT or any other AI system were to be introduced by means of internal regulations, the agreement of the staff delegation would be required.

The same would apply to the increasingly common integration of AI-equipped applicant tracking systems: this would probably fall within the scope of the procedure for establishing general criteria for personnel selection in the event of recruitment, and would therefore be subject to the agreement of the staff delegation. It would also raise personal data protection issues.

In any case, since the staff delegation must be informed and consulted on any issue relating to the improvement of employees' working conditions and employment, an AI system such as ChatGPT can only be introduced within the company after the staff delegation has been informed and consulted, regardless of the number of employees the company has.

Given the increasingly widespread use of AI by employers and employees alike, there will undoubtedly be many disputes, and consequently court rulings, in this area in the future.

For more information on CASTEGNARO-Ius Laboris Luxembourg, click here.