Digital Surveillance: Potential Effects on Workers and Roles of Federal Agencies

GAO-25-107126. Published: Sep 02, 2025. Publicly Released: Nov 24, 2025.

Fast Facts

Employers monitor workers for many reasons, such as tracking their safety or productivity. This Q&A examines digital surveillance’s effects on workers. It’s our second report on this topic.

Digital surveillance can positively and negatively affect workers’ physical and mental health. For example, it can alert them to potential health problems and increase their sense of safety. But it can also increase their anxiety or risk of injury by pushing them to move faster to meet productivity targets.

 

Highlights

What GAO Found

Digital surveillance tools can provide employers with information to help improve their operations and may have positive and negative effects on workers. Sometimes referred to as "bossware," these tools include cameras, microphones, computer monitoring software, geolocation trackers, phone applications, and devices worn by workers, among other tools.

GAO's work is based on interviews with stakeholders from 11 organizations: two trade associations, three advocacy organizations, and six research organizations. It is also based on a review of 122 studies that met GAO's standards for methodological quality.

According to stakeholders GAO interviewed and studies GAO reviewed, digital surveillance can have the following effects on workers:

  • Physical health and safety. Digital surveillance can both positively and negatively affect workers' physical health and safety. For example, some digital surveillance tools can identify cardiac issues, an indication of potential heart disease. Conversely, they can increase workers' risk of injuries by pushing them to move faster to meet productivity metrics.
  • Mental health. Digital surveillance can both positively and negatively affect workers' mental health. Positive effects can include increasing workers' sense of safety. Negative effects can include increased stress and anxiety. These effects can depend on employers' practices, including how transparent they are about what information they collect.
  • Employment opportunities. The design or incorrect use of some digital surveillance tools could limit their ability to accurately assess performance. For example, digital surveillance tools may use flawed productivity benchmarks, may not account for the full range of worker tasks and responsibilities, or may be used by the employer for unintended purposes. These types of limitations could make some workers more prone to experiencing negative effects on employment opportunities such as low performance evaluations, lower pay, disciplinary actions, or termination.

The Equal Employment Opportunity Commission (EEOC), the National Labor Relations Board (NLRB), and the Department of Labor's (DOL) Occupational Safety and Health Administration (OSHA) investigate claims that could involve digital surveillance. Several federal agencies have also previously provided guidance or resources to employers about the use of digital surveillance, but in 2025 these agencies have either rescinded these prior efforts or are reassessing their alignment with the current administration's priorities.

Why GAO Did This Study

Employer surveillance of workers has become more widespread as the number of people working remotely has increased and the types of surveillance technologies available have expanded. Employers monitor workers for a variety of reasons, such as tracking their performance, productivity, and health and safety.

Some worker advocates have expressed concern about the effects of digital surveillance on workers. GAO was asked to examine the potential effects of digital surveillance on workers' physical health and safety, mental health, and employment opportunities, as well as federal agencies' oversight of employers' use of this technology. This report provides information on these topics and follows a report GAO issued in 2024 that summarized public comments submitted to the White House Office of Science and Technology Policy (OSTP) about employers' use of automated digital surveillance tools to monitor workers and the effects of such surveillance on workers (GAO-24-107639).

GAO identified stakeholders to interview based on their published research or advocacy in this area, and the recommendations of other experts.

GAO identified and reviewed 122 studies about the effects of digital surveillance on workers' physical health and safety, mental health, and employment opportunities. All studies were published from 2020 through 2024 and were assessed by GAO for methodological rigor. While the studies were subject to certain limitations that could potentially affect their findings, GAO deemed them sufficiently robust for inclusion in this review. GAO also reviewed relevant laws, regulations, and agency information.

GAO interviewed knowledgeable officials from the DOL, EEOC, NLRB, OSTP, and the Consumer Financial Protection Bureau and requested updated information from them in spring of 2025.

For more information, contact Thomas Costa at costat@gao.gov.

Full Report

Key Takeaways

The Equal Employment Opportunity Commission (EEOC), the National Labor Relations Board (NLRB), and the Department of Labor’s Occupational Safety and Health Administration (OSHA) investigate claims that could involve digital surveillance. Several federal agencies have also provided guidance or resources to employers about the use of digital surveillance but have either rescinded these prior efforts or are reassessing their alignment with the current administration’s priorities.

How can employers’ use of digital surveillance affect workers’ physical health and safety?

Digital surveillance can positively or negatively affect workers’ physical health and safety depending on how employers use the technology, according to stakeholders we interviewed and studies we reviewed. We reported in prior work that employers use various types of digital surveillance tools (see fig. 1).1
Figure 1: Types of Digital Surveillance Tools Used by Employers
Examples of potential positive effects
• Increased awareness of physical health and safety. Ten of the 11
stakeholders we spoke with said that digital surveillance tools can alert
workers about potential physical health and safety problems when used for
that purpose. For example, a researcher said that assembly line workers,
such as those in factories or warehouses, may use wearables that notify
them when their heart rate is too high, signaling that they should take a break (a threshold alert of the kind sketched after this list). Another researcher said wearables used by oil and gas workers can
detect chemicals or other workplace hazards, such as extreme heat. Also, 36
of the studies we reviewed specifically looked at physical health and safety.
Thirteen of these studies found that digital surveillance tools can alert
workers about potential physical health and safety problems.2 For example,
one study found that a digital surveillance tool containing sensors in the
steering wheel of a car or truck could identify if drivers are experiencing
cardiac issues and sleepiness, indicators of potential heart disease and sleep
apnea.3
• Decreased risk of injuries. Eight stakeholders said that digital surveillance
tools can decrease workers’ risk of injuries when they are used to monitor
workers’ safety. For example, one researcher said that tools that scan the
workplace for hazards can identify spills and thus reduce slip hazards.
Another researcher said that some technology detects warehouse workers’
locations and delivers objects to them, which may reduce injuries for some
workers, including older workers and workers with disabilities. Additionally, 13
studies found that digital surveillance can decrease the risk of injury.4 For
instance, one study found that wristband sensors that may be worn by
workers in the construction industry can identify unsafe behavior, helping to
reduce physical injuries.5 Previously, we reported that wearables could reduce the risk of injuries from strenuous work or workers colliding with equipment, and may improve response time to emergencies.6
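
As a concrete illustration of the heart-rate alerts described in the first bullet above, the short Python sketch below implements a basic threshold check. It is our own minimal example; the threshold value, sample readings, and function name are hypothetical and are not drawn from the report or from any specific wearable product.

# Illustrative sketch only: a simple threshold alert of the kind described above.
# The threshold, readings, and function name are hypothetical.
HEART_RATE_ALERT_BPM = 140  # hypothetical alert threshold

def check_heart_rate(readings_bpm, threshold=HEART_RATE_ALERT_BPM):
    """Return a break reminder when any reading exceeds the threshold."""
    peak = max(readings_bpm)
    if peak > threshold:
        return f"Heart rate reached {peak} bpm: consider taking a break."
    return "Heart rate within the expected range."

print(check_heart_rate([96, 118, 131, 147]))  # triggers the alert
print(check_heart_rate([88, 102, 111]))       # no alert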
Examples of potential negative effects
• Increased risk of injuries. Seven stakeholders said that when employers
use digital surveillance tools to monitor productivity (i.e., the amount of work
that workers complete), they may push workers to move faster, which may
increase their risk of injury. For example, one researcher said that digital
surveillance tools can create unrealistic time frames for delivery drivers that
do not account for factors such as traffic, the driver’s physical condition, or
the delivery location. To meet these time frames, delivery drivers may take
risks that result in accidents and physical injuries. Four studies also found
that digital surveillance increased workers’ risk of injury.7 Previously, we
reported that the rate of injury can increase when employers use surveillance
tools to monitor workers’ productivity and push them to work faster.8
• Increased physical ailments. Three stakeholders said that when
employers use digital surveillance tools to monitor productivity, workers’
physical ailments could be exacerbated. One researcher said that digital
surveillance can make workers feel as if they cannot take breaks, which can
cause physical stress. According to another researcher, the strain from digital
surveillance can cause headaches, and a decreased ability to recover from
illnesses. Additionally, three studies found that digital surveillance can
contribute to greater fatigue among workers.9

How can employers’ use of digital surveillance affect workers’ mental health?

The way in which employers use digital surveillance can positively or negatively affect workers’ mental health, according to stakeholders we interviewed and studies we reviewed.
Example of a potential positive effect
Three stakeholders said that employers’ use of digital surveillance tools to
monitor for safety can increase workers’ sense of safety. For example, one trade
association representative said that when workers know their workplace is being
monitored for security purposes, it can reduce their fear of workplace violence.
Similarly, a researcher and representatives from another trade association said
digital surveillance tools may reduce anxiety for workers who work by themselves
in remote locations when they know that their company will be aware if a
situation arises where they need help or are not safe. Additionally, two of the 38
studies we reviewed about mental health found that workers feel safer when
employers use digital surveillance for this purpose.10
Examples of potential negative effects
Stress, anxiety, depression, and other negative mental health effects can result
from lack of transparency, continuous surveillance, and productivity monitoring.
• Lack of transparency. Seven stakeholders said that employers’ lack of
transparency about digital surveillance can increase workers’ stress and
anxiety. This lack of transparency can include workers not knowing what
information employers collect about them, who has access to that
information, and how that information is used. Additionally, six studies found
that a lack of transparency regarding how employers use digital surveillance
negatively affects workers’ mental health.11 For example, one study found
that workers may feel demoralized when employers do not explain their intent
for monitoring workers.12
• Continuous surveillance. Six stakeholders said that continuous surveillance
may negatively affect workers’ mental health. For example, one
representative from an advocacy organization said that workers may
experience anxiety and stress when they are continuously monitored.
Additionally, eight studies found that constant surveillance negatively affects
workers’ mental health.13 For example, one of these studies found that
workers who were continuously monitored reported feeling anxious and
demoralized.14 Another study found higher rates of depression among gig
workers who were constantly tracked through platforms.15
• Productivity monitoring. Five stakeholders said that workers may
experience negative mental health effects when employers use digital
surveillance tools to monitor their productivity. For example, a trade
association representative said that when this happens, workers may feel
stressed because digital surveillance tools do not detect the underlying
reasons for dips in productivity. Two researchers explained that productivity
monitoring often makes workers feel forced to move faster. This may lead
workers to cut their break time, which exacerbates job strain and triggers
negative mental health effects. A study also found that when digital
surveillance is used to monitor workers’ productivity, workers feel pressured
to work faster, increasing stress and making them feel like they do not have
control over their work.16

How can limitations in digital surveillance tools affect workers’ performance evaluations?

Limitations in digital surveillance tools due to flaws in productivity benchmarks or productivity measures can negatively affect workers’ performance evaluations, according to stakeholders we interviewed.
• Flaws in productivity benchmarks. According to seven stakeholders,
employers can use flawed productivity benchmarks that may not represent
performance across their whole workforce. Benchmarks are set by analyzing
the productivity of a group of workers. However, the sample used to set the
benchmark might not be representative of the full workforce, containing fewer
workers with disabilities, older workers, or female workers, for example, according to two researchers. As a result, the benchmark may set productivity expectations that do not reflect the larger workforce (see the illustrative sketch after this list).
• Flaws in productivity measures. Four stakeholders said that productivity
measures can have design flaws when they do not account for the full range
of workers’ tasks. For example, one researcher said that to accurately
measure workers’ productivity, digital surveillance tools must measure tasks
that workers are expected to perform. However, some tools do not measure
offline activities that are harder to track, such as time spent on research,
reading, or helping others. In such cases, digital surveillance tools can make
workers appear less productive for spending time on tasks that may be
important but are harder to measure.
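
To illustrate the benchmark concern in the first bullet above, the Python sketch below shows how a productivity threshold computed from an unrepresentative sample can flag workers whose pace is typical for their circumstances. It is our own minimal example under assumed numbers; the data, median-based threshold rule, and function name are hypothetical and do not describe any tool discussed in the report.

# Illustrative sketch only: hypothetical data and threshold rule. Shows how a
# productivity benchmark set from an unrepresentative sample can flag workers
# whose pace is typical for their circumstances.
from statistics import median

# Hypothetical tasks-per-hour figures used to set the benchmark. The sample
# over-represents one group of workers and under-represents others.
benchmark_sample = [52, 55, 58, 60, 61, 63, 64, 66]
benchmark = median(benchmark_sample)  # 60.5 tasks per hour

# Hypothetical full workforce, including workers the sample under-represented.
workforce = {
    "worker_a": 62,  # pace similar to the sampled group
    "worker_b": 48,  # older worker who takes needed rest breaks
    "worker_c": 50,  # worker using a reasonable accommodation
}

def flag_below_benchmark(rates, threshold):
    """Return the workers whose measured rate falls below the benchmark."""
    return [name for name, rate in rates.items() if rate < threshold]

print(f"Benchmark from sample: {benchmark} tasks per hour")
print("Flagged as 'unproductive':", flag_below_benchmark(workforce, benchmark))
# worker_b and worker_c are flagged even though their pace may be typical for
# workers in their circumstances; the benchmark was not set with them in mind.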

How could employers’ misinterpretation or misuse of data from digital surveillance affect employment opportunities for workers?

When employers misinterpret or misuse data collected by digital surveillance tools, workers’ employment opportunities could be negatively affected, according to stakeholders we interviewed. These negative effects could include reprimands, low performance evaluations, lower pay, reduced work hours, or termination.
Examples of potential misinterpretation
• Not understanding workers’ responsibilities. All 11 stakeholders said that
employers could misinterpret data about workers’ productivity, which could
stem from a lack of understanding about workers’ full range of tasks and
responsibilities. For example, one researcher said that digital surveillance
tools could misread a worker’s productivity if certain tasks are not measured
or if off-screen time for research, reading, thinking, or mentorship is not
accounted for. Also, employers who assess workers’ productivity solely using
data from digital surveillance tools may improperly label workers as
unproductive. This can occur when employers do not understand their
workers’ responsibilities or how their digital surveillance tools measure
productivity, according to another researcher.
• Believing tools do not make mistakes. Three stakeholders said some
employers believe that digital surveillance tools do not make mistakes. One
researcher said employers’ overestimation of the accuracy of these tools may
lead them to trust the tools over their employees. Similarly, another
researcher said that employers take the data collected through digital
surveillance at face value, not understanding that these tools could
underestimate workers’ performance. For example, a researcher said that
tone recognition software used in call centers could penalize workers if their
tone is not cheerful, even if a cheerful tone is inappropriate for the nature of
the call. Such software can also wrongly penalize workers with accents.
Without looking more closely at the data or understanding its limitations, a
manager may accept biased results that do not accurately capture the
workers’ performance. Additionally, two studies found that employers could
have an overly optimistic view of data collected from digital surveillance tools
and not fully understand their limitations.17
Examples of potential misuse
• Making employment decisions without human review. Eight stakeholders
said that employers could misuse data collected from digital surveillance to
make employment decisions without human review. This could negatively
affect workers’ employment opportunities. A researcher said that when
workers are managed by digital surveillance tools, they have fewer
opportunities to speak to a supervisor about issues that affect their
performance. For example, a housekeeper may not be able to finish
preparing a hotel room when towels are not available. When the housekeeper
is being managed through digital surveillance, rather than an onsite
supervisor with whom she can discuss the issue, the housekeeper may get
reprimanded. Additionally, one study found that about a third of participants
expressed concerns that employers could misuse digital surveillance to make
employment decisions such as firing or denying benefits and promotions to
workers.18
• Using tools for unintended purposes. Seven stakeholders said that
employers could misuse digital surveillance tools by using them for
unintended purposes. For example, trade association representatives said
that most digital surveillance tools were designed to monitor workers’ safety
and security, and therefore could do a poor job when used to measure
productivity. This could lead employers to use inaccurate data to make
employment decisions.

Which groups of workers may experience negative effects on employment opportunities from employers’ use of digital surveillance?

According to stakeholders we interviewed and studies we reviewed, certain groups of workers may be more likely to experience negative effects on employment opportunities—such as low performance evaluations, lower pay, disciplinary actions, or termination—from employers’ use of digital surveillance. These groups include:
• Workers of certain races and ethnicities. Seven stakeholders said that
employment opportunities for workers of certain races and ethnicities may be
negatively affected by employers’ use of digital surveillance tools. For
example, one researcher said that emotional monitoring technology, which
some employers use to evaluate workers, may disproportionately misidentify
workers of some races as expressing negative emotions. Also, two studies
found that some digital surveillance tools could inaccurately assess the
performance of Black workers.19 Additionally, another researcher said that
workers of certain ethnicities may be more vulnerable to the negative effects
of digital surveillance tools because they disproportionately have jobs that
rely on the tools to assess their performance. Since digital surveillance tools
could be prone to errors, workers of certain races and ethnicities may be
more likely to experience negative effects on their employment opportunities.
These negative effects could include disciplinary actions, poor performance
evaluations, and decreased advancement opportunities.20
• Female workers. Six stakeholders said that digital surveillance can
negatively affect female workers’ employment opportunities. Digital
surveillance tools may not measure complex yet important contributions,
according to four stakeholders we interviewed. For example, one researcher
told us that digital surveillance tools cannot measure building relationships
and working collaboratively, contributions often made by women. Given the
inability of digital surveillance tools to measure this kind of leadership activity,
women may be passed over for promotions. Additionally, five of the 26 studies
about the effects of digital surveillance on employment opportunities found
negative effects for women.21 One study found that some digital surveillance
tools could feed into employers’ existing stereotypes about women’s behavior
in the workplace.22 For example, according to the study, tools that monitor
workers' emotions could flag women as behaving inappropriately if they
disagree with their manager. Employers could then use this data as
justification for firing female workers while shielding themselves from potential
discrimination claims, according to the study’s authors.
• Workers with disabilities. Six stakeholders said that workers with
disabilities may also face negative effects on their employment opportunities
when employers use digital surveillance tools.23 For example, a researcher
said that when employers use digital surveillance tools to monitor workers,
workers with disabilities are disproportionately disciplined and receive
negative performance evaluations. This can lead to lower pay and fewer
career advancement opportunities. Additionally, representatives from an
advocacy organization said that some workers with disabilities may be afraid
to ask for reasonable accommodations and could get characterized as low
performers as a result.
• Older workers. Five stakeholders said that older workers’ employment
opportunities may be negatively affected by employers’ use of digital
surveillance. For example, a researcher said that older workers who may
need frequent breaks during the day for health reasons may skip breaks to
avoid being flagged as unproductive. Additionally, two studies found that
digital surveillance can negatively affect older workers’ employment
opportunities.24 For example, one study found that older workers could have
difficulty meeting productivity metrics because these benchmarks may have
been developed without accounting for enough older workers in the
workforce.25 This could put these workers at a disadvantage in terms of
receiving promotions or rewards when employers use digital surveillance
tools.
• Workers with accents. Four stakeholders said that digital surveillance tools
that are used to monitor speech can negatively affect employment
opportunities for workers with accents.26 These tools may have difficulty
detecting the speech and tone of workers with accents. For example, a
researcher said that workers with accents might be penalized for lacking
clarity if an employer uses voice monitoring software to evaluate
performance. A researcher said that surveillance tools in call centers may
incorrectly register higher error rates for workers with accents for not
complying with call scripts. This can lead to reprimands or terminations.

What federal and state requirements may affect employers’ digital surveillance of workers?

Some federal and state requirements may affect employers’ digital surveillance of workers.
• Federal. Title III of the Omnibus Crime Control and Safe Streets Act of 1968,
as amended by Title I of the Electronic Communications Privacy Act of 1986
(known as the “Wiretap Act”) generally prohibits intentionally intercepting
wire, oral, or electronic communications by using an electronic, mechanical,
or other device unless one party consents.27 For example, intercepting
workers’ personal phone calls without their consent could violate the Wiretap
Act. The Act provides for certain exceptions, however, such as when a
provider of electronic communication services intercepts communications in
the normal course of business (e.g., for quality control purposes). The Act
does not apply to other forms of monitoring that do not intercept wire, oral, or
electronic communications, such as tracking devices.
• State. Some states have laws that may affect employers’ use of digital surveillance. For example, such laws may require consent from both parties (the employee and the employer) to intercept certain communications, restrict the placement of digital surveillance tools, or prohibit employers from monitoring employees’ private conversations.

What role do federal agencies have in investigating workers’ complaints regarding digital surveillance?

Certain federal agencies are statutorily required to investigate claims from workers that involve their area of oversight, including claims that may stem from employers’ use of digital surveillance. Agency officials told us that they enforce relevant laws but do not track which specific claims involve the use of digital surveillance.
• Equal Employment Opportunity Commission (EEOC). The EEOC
enforces federal laws that prohibit employment discrimination by investigating
charges of discrimination related to workers’ race, color, religion, sex,
national origin, disability status, age, or genetic information. In some cases,
this could include charges involving the use of digital surveillance.28
• National Labor Relations Board (NLRB). The General Counsel of the
NLRB enforces the National Labor Relations Act by investigating allegations
of unfair labor practices brought by workers, unions, or employers. In March
2023, the General Counsel of the NLRB and the Director of the Consumer
Financial Protection Bureau (CFPB) signed a memorandum of understanding
to share information to support their respective missions, which could include
addressing practices involving employer surveillance of workers.29 This
memorandum remained in effect as of July 2025, according to NLRB officials.
• Department of Labor’s (DOL) Occupational Safety and Health
Administration (OSHA). OSHA’s mission is to ensure the safety and health
of workers. In June 2025, OSHA officials said that they would investigate
complaints regarding adverse effects on employees’ health or safety,
including those stemming from digital surveillance.

What information have federal agencies provided to employers to reduce the potential negative effects of digital surveillance?

The NLRB, DOL, EEOC, and CFPB have taken various steps to reduce the potential negative effects of digital surveillance on workers. This includes providing guidance or resources for employers. Since January 2025, these agencies have either rescinded these past efforts or are currently reviewing them to ensure that they align with the current administration’s priorities. In addition, in 2023, the White House Office of Science and Technology Policy (OSTP) collected information from the public about workers’ experiences with digital surveillance but has since removed this information from its website.
Past efforts to develop guidance
• Guidance on the right to unionize. In October 2022, the General Counsel
of the NLRB issued a memorandum explaining that digital surveillance may
infringe on workers’ right to organize under the National Labor Relations Act.
Specifically, the memo stated that digital surveillance could severely limit or
prevent employees from organizing and keeping their efforts confidential from
their employer. The General Counsel, when reviewing charges and issuing
complaints, planned to urge the Board to adopt a framework to apply the Act
to protect employees from intrusive or abusive electronic monitoring and
automated management practices that could interfere with certain rights
under the Act. In February 2025, the NLRB’s Acting General Counsel
rescinded the October 2022 memorandum after a review of active General
Counsel memoranda. NLRB officials said that this was done as part of an
initiative to refocus resources on the agency’s core mission.
• Best practices for worker well-being. In October 2024, DOL published best
practices for employers’ use of digital surveillance tools with artificial
intelligence (AI) components to monitor employees. These best practices
included (1) having human oversight of surveillance tools; (2) being
transparent with employees about the use of digital surveillance, the
information that is collected, and procedures for employees to correct the
data used to make important employment decisions; and (3) ensuring that
digital surveillance does not unfairly disadvantage certain groups of workers
with regard to employment decisions.
In January 2025, DOL removed the best practices from its website. In
June 2025, DOL officials told us that they are reviewing all materials on their
website to make sure that they align with the new administration’s priorities.
• Guidance on digital surveillance for workers with disabilities. Through its
initiative on accessible technology, DOL identified ways that digital
surveillance can create the risk of discrimination against workers with
disabilities and encouraged employers to develop best practices to reduce
these risks. In June 2025, DOL officials told us that they had removed this
resource from the agency’s website as part of their review to ensure that
available resources align with current policy. They said that when this review
is complete, they will determine what resources to make available on their
website.

Additionally, in 2022, the EEOC issued technical assistance to employers
regarding promising practices they can implement to comply with the
Americans with Disabilities Act when using algorithms, which may include
digital surveillance tools, to make employment-related decisions. For
example, employers can develop alternative ways to evaluate workers when
the current evaluation process is inaccessible or otherwise unfairly
disadvantages someone who has requested a reasonable accommodation
because of a disability. EEOC officials told us that they removed this
document from their website while officials assess its compliance with an
Executive Order that was issued in January 2025.30
• Guidance on consumer protections for workers. In October 2024, the
CFPB issued guidance explaining that longstanding consumer protections
may apply to consumer reports about workers that are obtained through
digital surveillance, like they are for traditional credit reports.31 Specifically,
the guidance explained that companies using third-party consumer reports
about their workers for employment purposes—including background
dossiers and surveillance-based scores—have obligations under the Fair
Credit Reporting Act. These generally include obtaining a worker’s consent,
providing notice about data used in adverse employment decisions, and
providing notice of how to dispute inaccurate information. The CFPB
rescinded this and other guidance in May 2025, citing efforts to reduce
compliance burdens among other reasons.32
Past efforts to collect information
In May 2023, OSTP requested information from the public—including private and public sector workers—to better understand the prevalence, uses, purposes, and deployment
of automated digital surveillance tools, including effects of these tools on
workers’ physical and mental health, privacy, and ability to exercise workplace
rights. In 2024, OSTP published responses to its request for information
regarding experiences with the use of automated worker surveillance and
management.33 As of July 2025, the responses are no longer available on
OSTP’s website.

Agency Comments

We provided a draft of this report to the CFPB, DOL, EEOC, and NLRB for
review and comment. We received technical comments from each of these
agencies, which we incorporated as we deemed appropriate.

How GAO Did This Study

To describe the effects of digital surveillance on workers, we interviewed
stakeholders and reviewed studies and relevant GAO reports. We interviewed
stakeholders from 11 organizations: two trade associations, three advocacy
organizations, and six research organizations (see app. I for a list of the
organizations). We identified these stakeholders based on their published
research or advocacy in this area and the recommendations of other experts. We
also consulted with GAO technologists and data scientists about how digital
surveillance tools are developed, the limits they may have, and how they can be
misused.
Additionally, we reviewed studies about the effects of digital surveillance on
workers’ mental and physical health and the potential effects on employment
opportunities that may result from the use of these tools. Some questions in this
report do not include a discussion of studies because we did not identify studies
that pertained to those specific topics.
We followed a rigorous process to identify and assess the studies. To identify
studies, we conducted keyword searches of various databases, such as Scopus,
ABI/Inform, ProQuest Research Library, and Social SciSearch. We searched for
phrases such as “automated surveillance of workers,” “algorithmic management,”
“digital surveillance of workers,” and “electronic surveillance.” We also asked the
stakeholders we interviewed to recommend studies. We limited our studies to
those that were published from 2020 through 2024 to (1) capture the increase in
telework that occurred after the onset of the COVID-19 pandemic and (2) obtain
recent information concerning the use of digital surveillance technology given its
rapidly evolving nature. Through this process, we identified 249 studies. Of
these, we determined that 67 studies were not germane to our report.
Next, we assessed the methodological quality of the remaining 182 studies. To
start, one GAO analyst reviewed the studies to determine whether they met
GAO’s minimum standards for inclusion in our report. A knowledgeable GAO
expert then reviewed the studies’ findings and methods to ensure the
methodologies were appropriate and sufficiently rigorous. Following this
assessment, we removed 60 studies because we determined that their methods
were not sufficiently appropriate or rigorous.
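
As a quick check on the screening counts above, the Python sketch below tallies the steps (249 studies identified, 67 excluded as not germane, and 60 excluded for methodological concerns, leaving 122). The stage labels and code are ours; this is an illustration of the arithmetic, not GAO’s actual screening tooling.

# Tally of the study-screening counts described above; stage labels are ours.
stages = [
    ("Identified through database searches and stakeholder recommendations", 249),
    ("Excluded as not germane to the report", -67),
    ("Excluded because methods were not sufficiently appropriate or rigorous", -60),
]

remaining = 0
for label, change in stages:
    remaining += change
    print(f"{label}: {change:+d} (running total: {remaining})")

assert remaining == 122  # the 122 studies used as supporting evidence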
The remaining 122 studies met our criteria for methodological rigor and are used
in this report as supporting evidence for our findings. We used qualitative
software to analyze them and identify themes across their findings. While these
studies had certain limitations, we determined that they were sufficiently robust
for inclusion. Some of these limitations include:
• Limited scope: studies that do not examine different ways technology can be implemented are limited in their ability to account for potential differences in the effects of these technologies under varied circumstances.
• Reliance on self-reported information: studies that rely on self-reported
information can introduce bias and limit the use of these data for causal
inference. Examples of self-reported information include responses to
surveys and interviews.
• Limited number of clinical studies: studies that do not use clinical measures
to study the effects on physical and mental health limit the ability to assess
clinical effects. There were only a small number of clinical studies in the
articles we analyzed.
• Lack of longitudinal analysis: studies that do not follow people over time are
limited in their ability to identify long-term effects or changes in people’s
behavior.
We also reviewed relevant GAO reports to provide further context about the
effects of digital surveillance on workers.
To describe federal oversight of digital surveillance and guidance provided to
employers, we interviewed knowledgeable officials from the CFPB, DOL, EEOC,
NLRB, and OSTP and requested updated information from them in spring 2025.
Additionally, we reviewed relevant information published by these agencies,
including best practices, a memorandum of understanding, and descriptions of
their oversight or investigation activities regarding digital surveillance. We also
reviewed relevant laws and regulations pertaining to the use of digital
surveillance technology to monitor workers.
We conducted this performance audit from October 2023 to September 2025 in
accordance with generally accepted government auditing standards. Those
standards require that we plan and perform the audit to obtain sufficient,
appropriate evidence to provide a reasonable basis for our findings and
conclusions based on our audit objectives. We believe that the evidence
obtained provides a reasonable basis for our findings and conclusions based on
our audit objectives.

List of Addressees

The Honorable Robert C. “Bobby” Scott
Ranking Member
Committee on Education and Workforce
House of Representatives
As agreed with your office, unless you publicly announce the contents of this
report earlier, we plan no further distribution until 30 days from the report date. At
that time, we will send copies to the appropriate congressional committees and
other interested parties. In addition, the report will be available at no charge on
the GAO website at https://www.gao.gov.

GAO Contact Information

For more information, contact: Thomas Costa, Director, Education, Workforce, and Income Security, costat@gao.gov.
Public Affairs: Sarah Kaczmarek, Managing Director, Media@gao.gov.
Congressional Relations: A. Nicole Clowers, Managing Director,
CongRel@gao.gov.
Staff Acknowledgments: Mary Crenshaw (Assistant Director), Hedieh Fusfield
(Analyst in Charge), Dustin Cohan, and Yasmine Evans. In addition, key support
was provided by Shea Bader, James Bennett, John Bornmann, Rosanna
Guerrero, Stacia Odenwald, Aaron Olszewski, Kathleen van Gelder, Kristen
Watts, and Adam Wendel.
Connect with GAO on Facebook, X, LinkedIn, Instagram, and YouTube.
Subscribe to our Email Updates. Listen to our Podcasts.
Visit GAO on the web at https://www.gao.gov.
This is a work of the U.S. government but may include copyrighted material. For
details, see https://www.gao.gov/copyright

Appendix I: Stakeholder Organizations Interviewed by GAO
Advocacy Organizations
Technology Institute of the American Federation of Labor and Congress of
Industrial Organizations (AFL-CIO)
American Civil Liberties Union
Coworker.org
Research Organizations
Center for AI and Digital Policy
Centre for Research into Information, Surveillance and Privacy, Edinburgh
Law School
Human-Computer Interaction Institute, Carnegie Mellon University
Partnership on AI
School of Human Resources and Labor Relations at Michigan State
University
University of California, Berkeley Center for Labor Research and Education
Trade Associations
Electronic Security Association
Security Industry Association

Endnotes

1GAO, Digital Surveillance of Workers: Tools, Uses, and Stakeholder Perspectives, GAO-24-107639 (Washington, D.C.: August 2024). This report summarized public comments submitted to
OSTP through a request for information on the use of automated digital surveillance tools to
monitor workers and the effects of such surveillance on workers. (Request for Information;
Automated Worker Surveillance and Management, 88 Fed. Reg. 27,932 (May 3, 2023)).
2Devanash Atray and Rejesjwar Dass, “Employee Health Monitoring System for Industry 4.0,” in
Emergent Converging Technologies and Biomedical Systems, ed. Shruti Jain, Nikhil Marriwala, C.
C. Tripathi, and Dinesh Kumar (Springer, 2022), 255–265; Srikanth Bangaru, Chao Wang, and
Fereydoun Aghazadeh, “Automated and Continuous Fatigue Monitoring in Construction Workers
Using Forearm EMG and IMU Wearable Sensors and Recurrent Neural Network,” Sensors, vol. 22
(2022); Jordan Cahoon and Luis Garcia, “Continuous Stress Monitoring for Healthcare Workers:
Evaluating Generalizability Across Real-World Datasets,” in BCB ’23: Proceedings of the 14th ACM
International Conference on Bioinformatics, Computational Biology, and Health Informatics, (2023);
Caroline Clingan et al., “Monitoring Health Care Workers at Risk for COVID-19 Using Wearable
Sensors and Smartphone Technology: Protocol for an Observational mHealth Study,” JMIR
Research Protocols, vol. 10, no. 5 (2021); Jennifer Cori et al., “An Evaluation and Comparison of
Commercial Driver Sleepiness Detection Technology: A Rapid Review,” Physiological
Measurement, vol. 42, no. 7 (2021); Shanley Corvite, Kat Roemmich, Tillie I. Rosenberg, and
Nazanin Andalibi, “Data Subjects’ Perspectives on Emotion Artificial Intelligence Use in the
Workplace: A Relational Ethics Lens,” Proceedings of the ACM on Human-Computer Interaction,
vol. 7, no. 124 (2023); Carly Harrison, Scott Ruddock, Paul O’Halloran, Susan Mayes, Jill Cook,
and Mandy Ruddock-Hudson, “Wellness Monitoring for Professional Ballet Dancers: A Pilot Study,”
Journal of Dance Medicine & Science, vol. 25, no. 2 (2021); Abdullahi Ibrahim, Muhammad Khan,
Chukwuma Nnaji, and Amanda Koh, “Assessing Non-Intrusive Wearable Devices for Tracking Core
Body Temperature in Hot Working Conditions,” Applied Sciences, vol. 13, no. 6803 (2023);
Muhammad Khan, Abdullahi Ibrahim, Chukwuma Nnaji, and Ashrant Aryal, “Developing Prediction
Models for Monitoring Workers’ Fatigue in Hot Conditions,” Computing in Civil Engineering (2023);
Yun-Soung Kim et al., “Soft Wireless Bioelectronics Designed for Real-Time, Continuous Health
Monitoring of Farmworkers,” Advanced Healthcare Materials, vol. 11, no. 13 (2022); Eric Kirkendall
et al., “Feasibility, Acceptability, and Performance of a Continuous Temperature Monitor in Older
Adults and Staff in Congregate-Living Facilities,” Journal of the American Medical Directors
Association, vol. 23 (2022); Wonil Lee, Ken-Yu Lin, Peter W. Johnson, and Edmund Y. W. Seto,
“Selection of Wearable Sensor Measurements For Monitoring and Managing Entry-Level
Construction Worker Fatigue: A Logistic Regression Approach,” Engineering, Construction and
Architectural Management, vol. 29, no. 8 (2022); and A. Ojha, S. Shakerian, M. Habibnezhad, and
H. Jebelli, “Feasibility Verification of Multimodal Wearable Sensing System for Holistic Health
Monitoring of Construction Workers,” in Proceedings of the Canadian Society of Civil Engineering
Annual Conference 2021, ed. Scott Walbridge et al., vol. 239 (2023), 283–294.
3Cori et al., “An Evaluation and Comparison of Commercial Driver.”
4Ana Arboleya, Jaime Laviada, Yuri Álvarez-López, and Fernando Las-Heras, “Real-Time Tracking
System Based on RFID to Prevent Worker–Vehicle Accidents,” IEEE Antennas and Wireless
Propagation Letters, vol. 20, no. 9 (2021); Oscar Arias, James Groehler, Mike Wolff, and Sang D.
Choi, “Assessment of Musculoskeletal Pain and Physical Demands Using a Wearable Smartwatch
Heart Monitor among Precast Concrete Construction Workers: A Field Case Study,” Applied
Sciences, vol. 13, no. 2347 (2023); Kirstie Ball, Electronic Monitoring and Surveillance in the
Workplace: Literature Review and Policy Recommendations, Publications Office of the European
Union (Luxembourg: 2021); Bangaru, Wang, and Aghazadeh, “Automated and Continuous Fatigue
Monitoring”; Aarti Bansal, Rajesh Khanna, and Surbhi Sharma, “Platform Tolerant RFID Tag
Antenna Design for Safety and Real-Time Tracking of On-site Workers at Riskier Workplaces,”
International Journal of Antennas and Propagation, vol. 2023 (2023); Cori et al., “An Evaluation and
Comparison of Commercial Driver”; Suyra Garimella, Ahmed Senouci, and Kyungki Kim,
“Monitoring Fatigue in Construction Workers using Wearable Sensors,” Construction Research
Congress (2020); Harrison, Ruddock, O’Halloran, Mayes, Cook, and Ruddock-Hudson, “Wellness
Monitoring for Professional Ballet Dancers”; Mohamed Zul Fadhli Khairuddin et al., “Occupational
Injury Risk Mitigation: Machine Learning Approach and Feature Optimization for Smart Workplace
Surveillance,” International Journal of Environmental Research and Public Health, vol. 19, no.
13962 (2022); Khan, Ibrahim, Nnaji, and Aryal, “Developing Prediction Models”; Kim et al., “Soft
Wireless Bioelectronics”; Lee, Lin, Johnson, and Seto, “Selection of Wearable Sensor
Measurements”; and Ojha, Shakerian, Habibnezhad, and Jebelli, “Feasibility Verification.”
5Ojha, Shakerian, Habibnezhad, and Jebelli, “Feasibility Verification.”
6GAO, Science & Tech Spotlight: Wearable Technologies in the Workplace, GAO-24-107303
(Washington, D.C.: March 2024).
7Ball, “Electronic Monitoring and Surveillance in the Workplace”; Bangaru, Wang, and Aghazadeh,
“Automated and Continuous Fatigue Monitoring”; Saeed Jaydarifard, Krishna Behara, Douglas
Baker, and Alexander Paz, “Driver Fatigue In Taxi, Ride-Hailing, and Ridesharing Services: A
Systematic Review,” Transport Reviews, vol. 44, no. 3 (2023); and Zoey Laskaris et al., “A Price
Too High: Injury and Assault Among Delivery Gig Workers in New York City,” Journal of Urban
Health, vol. 101, (2024).
8GAO, Workplace Safety and Health: OSHA Should Take Steps to Better Identify and Address
Ergonomic Hazards at Warehouses and Delivery Companies, GAO-24-106413 (Washington, D.C.:
September 2024).
9Jaydarifard, Behara, Baker, and Paz, “Driver Fatigue”; Laskaris et al., “Injury and Assault Among
Delivery Gig Workers”; and Kat Roemmich, Florian Schaub, and Nazanin Andalibi, “Emotion AI at
Work: Implications for Workplace Surveillance, Emotional Labor, and Emotional Privacy,” in CHI
’23: Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, ed.
Albrecht Schmidt et al., (2023).
10Ball, “Electronic Monitoring and Surveillance in the Workplace”; and Paul Bowell, Gavin Smith,
Ekaterina Pechenkina, and Paul Scifleet, “‘You’re Walking on Eggshells’: Exploring Subjective
Experiences of Workplace Tracking,” Culture and Organization, vol. 29, no. 6 (2023).
11Ball, “Electronic Monitoring and Surveillance in the Workplace”; Bowell, Smith, Pechenkina, and
Scifleet, “Subjective Experiences of Workplace Tracking”; Tingru Cui, Barney Tan, and Yunfei Shi,
“Fostering Humanistic Algorithmic Management: A Process Of Enacting Human-Algorithm
Complementarity,” Journal of Strategic Information Systems, vol. 33 (2024); Mareike Möhlmann,
Carolina Alves de Lima Salge, and Marco Marabelli, “Algorithm Sensemaking: How Platform
Workers Make Sense of Algorithmic Management,” Journal of the Association for Information
Systems, vol. 24 no. 1 (2023); Bradley Pitcher, Ahleah Miles, Peter Mancarella, and Tara Behrend,
“Socioeconomic and Job Status Differences in the Experience of Perceived Unacceptable
Electronic Performance Monitoring,” Technology, Mind, and Behavior: Special Collection:
Technology, Work, and Inequality (2022); and Angie Zhang, Alexander Boltz, Chun-Wei Wang, and
Min Kyung Lee, “Algorithmic Management Reimagined For Workers and By Workers: Centering
Worker Well-Being in Gig Work,” in CHI ’22: Proceedings of the CHI Conference on Human Factors
in Computing Systems, (2022).
12Bowell, Smith, Pechenkina, and Scifleet, “Subjective Experiences of Workplace Tracking.”
13Ball, “Electronic Monitoring and Surveillance in the Workplace”; Bowell, Smith, Pechenkina, and
Scifleet, “Subjective Experiences of Workplace Tracking”; Luc Cousineau, Ariane Ollier-Malaterre,
and Xavier Parent-Rocheleau, “Employee Surveillance Technologies: Prevalence, Classification,
and Invasiveness,” Surveillance & Society, vol. 21, no. 4 (2023), 445-468; Cui, Tan, and Shi,
“Fostering Humanistic Algorithmic Management”; Thomas Kalischko and René Riedl, “On the
Consequences of Electronic Performance Monitoring in Organizations: Theory and Evidence,”
Digital Transformation and Society, vol. 3, no. 1 (2024), 50-79; Roemmich, Schaub, and Andalibi,
“Emotion AI at Work”; Louisa Scheepers, Peter Angerer, and Nico Dragano, “Digitalisation in Craft
Enterprises: Perceived Technostress, Readiness for Prevention and Countermeasures—A
Qualitative Study,” International Journal of Environmental Research and Public Health, vol. 19
(2022), 1-16; and Mauren Wolff, Jerod White, Martin Abraham, Claus Schnabel, Luisa Wieser, and
Cornelia Niessen, “The Threat of Electronic Performance Monitoring: Exploring the Role of Leader-Member Exchange on Employee Privacy Invasion,” Journal of Vocational Behavior, vol. 154
(2024), 1-18.
14Bowell, Smith, Pechenkina, and Scifleet, “Subjective Experiences of Workplace Tracking.”
15Maya De Los Santos, Kimberly Do, Michael Muller, Saiph Savage, “Designing Sousveillance
Tools for Gig Workers,” CHI '24: Proceedings of the 2024 CHI Conference on Human Factors in
Computing Systems, no. 384 (2024), 1-19.
16Scheepers, Angerer, and Dragano, “Perceived Technostress.”
17Panagiota Koukouvinou and Jonny Holmström, “AI Management Beyond Myth and Hype: A
Systematic Review and Synthesis of the Literature,” Pacific Asia Journal of the Association for
Information Systems, vol. 16, no. 2 (2024); and Peter Mantello, Manh‑Tung Ho, Minh‑Hoang
Nguyen, and Quan-Hoang Vuong, “Bosses Without a Heart: Socio-Demographic and Cross-Cultural Determinants of Attitude towards Emotional AI in the Workplace,” AI and Society, vol. 38
(2023).

18Corvite, Roemmich, Rosenberg, and Andalibi, “Emotion Artificial Intelligence Use in the
Workplace.”
19Corvite, Roemmich, Rosenberg, and Andalibi, “Emotion Artificial Intelligence Use in the
Workplace”; and Roemmich, Schaub, and Andalibi, “Emotion AI at Work.”
20Corvite, Roemmich, Rosenberg, and Andalibi, “Emotion Artificial Intelligence Use in the
Workplace”; and Roemmich, Schaub, and Andalibi, “Emotion AI at Work.”
21Ball, “Electronic Monitoring and Surveillance in the Workplace”; Corvite, Roemmich, Rosenberg,
and Andalibi, “Emotion Artificial Intelligence Use in the Workplace”; Behnoush Jovari, “Artificial
Intelligence Ethics in Organizational Human Resources Management,” International Journal of
Management, Accounting and Economics, vol. 11, no. 7 (2024); Roemmich, Schaub, and Andalibi,
“Emotion AI at Work”; and Jennifer Jiang, Isabell Lippert, and Armin Alizadeh, “Workers’ Perceived
Algorithmic Exploitation on Online Labor Platforms,” in Forty-Fourth International Conference on
Information Systems, Hyderabad (2023).
22Roemmich, Schaub, and Andalibi, “Emotion AI at Work.”
23We did not identify any studies in our review that met our standards for methodological quality
and explicitly addressed the effects of digital surveillance on employment opportunities for workers
with disabilities.
24Ball, “Electronic Monitoring and Surveillance in the Workplace”; and Jovari, “Artificial Intelligence
Ethics.”
25Jovari, “Artificial Intelligence Ethics.”
26We did not identify any studies in our review that met our standards for methodological quality
and explicitly addressed the effects of digital surveillance on employment opportunities for workers
with accents.
27See 18 U.S.C. §§ 2510–2521.
28As of January 28, 2025, the EEOC no longer has a quorum of its leadership panel of
Commissioners. Although it has no quorum, the EEOC continues to enforce federal
antidiscrimination laws, according to agency officials. Officials also told us that the lack of a quorum
does not impact the EEOC’s intake, processing, investigation, or resolution of charges of
discrimination.
29The CFPB was created to provide a single point of accountability for enforcing federal consumer
financial laws and protecting consumers in the financial marketplace.
30Specifically, the EEOC cited Exec. Order No. 14,179, 90 Fed. Reg. 8741 (Jan. 23, 2025).
31Consumer Financial Protection Bureau, Consumer Financial Protection Circular 2024-06:
Background Dossiers and Algorithmic Scores for Hiring, Promotion, and Other Employment
Decisions (Oct. 24, 2024).
32See 90 Fed. Reg. 20,084 (May 12, 2025).
33Request for Information; Automated Worker Surveillance and Management, 88 Fed. Reg. 27,932
(May 3, 2023).