
Tinybuild’s CEO admits that the publisher uses AI tools to monitor employees


Tinybuild CEO Alex Nichiporchik says the company uses artificial intelligence to identify toxic employees, a category that, according to him, also includes those experiencing burnout.

Nichiporchik described how the company uses AI in the workplace during his Develop: Brighton talk titled “AI in Gamedev: Is My Job Safe,” which included a section called “AI for HR” and discussed how the technology can be used to comb through employee communications and identify “problem” workers.

In an excerpt from the talk published by Why Now Gaming, Nichiporchik explains how the Hello Neighbor publisher feeds text from Slack messages, automated transcriptions from Google Meet and Zoom, and the task managers used by the company into ChatGPT to perform an “I/me analysis” that allows Tinybuild to measure “the likelihood of a person going into burnout.”

Nichiporchik said the technology, which he claims to have invented, is “weird Black Mirror” territory and involves using ChatGPT to monitor the number of times workers said “I” or “me” in meetings and other correspondence.

Why? Because he claims there is a “direct correlation” between how often someone uses these words, relative to their total word count, and the likelihood that they will experience burnout at some point in the future.

Toxicity and burnout? Tinybuild boss claims they’re ‘the same’

Notably, Nichiporchik also suggests the AI tool could be used to identify “problematic” team members and compare them to “A players” by looking for those who talk a lot in meetings or who type a lot. He calls these employees “time vampires” because, in his view, they are essentially time wasters, noting: “Once that person is no longer with the company or the team, the meeting takes 20 minutes and we get five times more done.”

At one point, Nichiporchik suggests AI was used to identify a studio lead who was not in a “good place” and help them avoid burnout. On the surface, this might seem like a win, but the CEO goes on to suggest that “toxic people are usually people who are burned out,” adding, “they’re the same.”

A slide from Nichiporchik's Develop talk indicating there is no distinction between toxicity and burnout. Image via Why Now Gaming / Develop

Equating toxicity with burnout is misleading at best, but even more troubling is the fact that, earlier in the talk, Nichiporchik discussed removing problematic employees from the company to increase productivity. In this case, that would apparently include those suffering from burnout.

At the end of his talk, Nichiporchik said companies should look to use AI tools in a positive way and avoid making workers “feel like they’re being spied on.” It’s carefully worded advice that suggests it might be acceptable to spy on your employees, provided you don’t allow them to “feel” as if they’re being spied on.

Game Developer reached out to Tinybuild for more information on how it uses AI in the workplace.

Update (07/14/23): Replying to Why Now Gaming’s original story on Twitter, Nichiporchik claims that some of the slides in his presentation were taken out of context, and suggests that using AI in the workplace is “not about identifying problem employees,” but rather “giving HR tools to identify and prevent people from burning out.”

Nichiporchik then points to another slide featured in the presentation, which states, “99 percent of the time you identify burnout, it’s already too late.” He also notes that the title slide for the “AI for HR” section had been swapped for a new slide reading “AI to Prevent Burnout,” which he describes as “more subtle.”

“The ethics of such processes are certainly questionable, and this question was brought up during the Q&A after the presentation,” Nichiporchik continues, acknowledging that feeding employee chatter into an AI tool in an attempt to assess workers’ performance and mental state is ethically questionable. “That’s why I say ‘too Black Mirror.’ That part was hypothetical: what you could do to prevent burnout.”

The CEO closed his Twitter thread by asking for feedback from the Develop: Brighton attendees who watched the presentation in “full context,” before stating that Why Now Gaming’s coverage comes from “a place of hate.”

Update 2 (07/14/23): In a separate response sent directly to Why Now Gaming, Nichiporchik said the HR portion of his talk was purely “hypothetical” and that Tinybuild does not use AI tools to monitor employees.

The statement reads: “The HR part of my presentation was hypothetical, hence the Black Mirror reference. I could have made that clearer for anyone viewing it out of context. We don’t monitor employees or use AI to identify problems. The presentation explored how AI tools can be used, and some of them verge on creepy territory. I wanted to explore how they can be used for good.”