We gathered with Veena Dubal, Zephyr Teachout, and Zubin Soleimany to discuss how algorithmic systems are affecting workers. Their conclusions were clear: these technologies are being used to increase profits and reduce worker control through surveillance and tracking. To quote Zephyr, “when power is centralized, data amplifies that power.” More often than not, workers don’t even know when AI systems are being applied or how their performance is being evaluated.

Bios

Veena Dubal is a Professor of Law at the University of California, Irvine. She is a scholar and attorney who specializes in employment and labor law. Her research, a combination of ethnography, history, and critical theory, sits at the nexus of law, technology, and precarious work.

Zephyr Teachout is an attorney, political activist, and anti-trust and corruption expert. She is Professor of Law at Fordham University and recently completed a term as Senior Counsel for Economic Justice in the Office of New York Attorney General Letitia James.

Zubin Soleimany is Senior Counsel at the New York Taxi Workers Alliance (“NYTWA”). The NYTWA is a union of professional New York City drivers who work in the yellow taxi, black car, and green cab industries. It fights for fair working conditions for drivers across all sectors of New York’s taxi and for-hire vehicle industries by challenging employee misclassification where it exists and by pushing for the creation and enforcement of livable income standards for drivers through City regulation.

Highlights

  • One thread that keeps coming up is the obscurity of this space. How much is unknowable? I’d love to unpack that a little and understand the little that we do know: how have we been able to uncover some of these practices, and where do we need regulators or policymakers to go further?
    • Veena Dubal: Traditionally, we have this sense that when you go to your next job, you are in a position to share some information and to withhold other information. An employer can ask you certain questions, such as how much you were paid in your previous job, and you can choose not to answer, to answer honestly, or to answer dishonestly. You have something you can withhold, which ends up being, particularly for middle-wage workers, kind of like their only bargaining tool in relation to their employer. But if an employer can purchase everything that is known about you from a previous workplace, even if that information is flawed or wrong, and even if your compensation is not variable, they can set your compensation in ways that are really unfair and over which you have very little control.
  • What are you seeing on the ground as workers grapple with the uses of artificial intelligence and push for change?
    • Zephyr Teachout: I think the black box point is really important, and it’s a black box at a societal level, at an industry level, and at a worker level. The obfuscation is part of the exercise of power itself. When I look at this moment, I see a few different things. One is that some workplaces have always been surveilled: home care workers, for instance, faced total or close-to-total surveillance even before these technologies existed. But there are really significant moments in the rise of surveillance and really significant changes in the nature of the technology.
    • Zephyr Teachout: One part of workplace surveillance is what is being surveilled; the other part is the way in which that surveillance translates into wage differentials, differential treatment, and different bonuses.
  • In what ways is regulating AI in the context of the workplace distinct? What policy measures do we need given the power inequalities and information asymmetries that are specific to the workplace?
    • Zubin Soleimany: Perhaps it’s not so much the algorithm itself that needs to be regulated but the outcome, and then perhaps the rest can follow.
  • One thread that keeps coming up is the obscurity of this space. How much is unknowable? I’d love to unpack that a little and understand the little that we do know: how have we been able to uncover some of these practices, and where do we need regulators or policymakers to go further?
    • Veena Dubal: We do deep research with workers to find out what they’re experiencing, but that again is really limited to the subset of workers we have access to. We only know what’s happening in certain sectors, and even there our knowledge is very limited because so much of it is hidden from us. What we know is largely limited to Uber drivers and Amazon workers, because those are the workers who have gone to the media and whom researchers are studying. For example, we know very little about how people are experiencing keystroke-tracking software, what information is gleaned from it, how it transforms the workday, et cetera. We know very little about what is happening with face surveillance software.

Further Readings