The most important stakeholders in the platform economy are the workers. Platform work can provide employment – or additional employment – for economically marginalised individuals. Accordingly, platform workers often exist on the margins of the labour market, vulnerable to various social, physical, and economic risks. Not all platform workers are economically marginalised, however, and the centrality of platform income to workers’ livelihoods varies considerably.
When considering workers as stakeholders, we must also take into account workers’ families and communities, as these actors are also affected by the issues of platform work and may condition workers’ access to, and experiences within, the labour market.
Labour contracts: Platforms often strive to avoid establishing employer-employee relationships by defining workers as freelancers, service providers, or independent contractors. Nevertheless, platforms often assume controls associated with employers. Workers are thus sometimes falsely defined as self-employed. Contracts may also be informal or offered intermittently, creating precarious labour conditions. Some workers, though, enjoy the flexibility afforded by these contracts.
Platform workers are often unable to avail themselves of the rights and protections extended to those designated as employees.
Some workers, such as creators who are paid by brand partners but remain reliant on platforms to broker deals, do not fit readily into traditional models of employment.
Exploitation: Workers are often undercompensated for their labour due to opaque pricing structures and dynamically adjusted rates of pay.
Wage theft: Wage theft can be direct, as in the case of the withholding of payment for completed work, unpaid tips, or under-payment. It may also relate to unpaid work time such as that involved in preparing and maintaining profiles, creating content, travelling to fulfil a task, and managing end-user relationships.
Algorithmic management: Platform work may be allocated, managed, assessed, and remunerated using opaque computational systems. Algorithms can determine visibility to clients or audiences, thus shaping access to the labour market, or can be used to evaluate worker performance and given the capacity to terminate contracts. They can also embed racist and sexist logics, creating exclusions and inequitable conditions. Appeal processes for algorithmic decisions are typically slow or non-existent. Contributing to the opacity of algorithms is their impermanence: business operations demand constant adjustment of algorithmic models, and machine learning continually changes their parameters.
However, this is not to say that workers are complacent. Many empirical studies show that workers respond to algorithmic management and develop various strategies to deal with it, for example by exploiting fissures in algorithmic power.
Safety and health: Workers are typically responsible for managing their own occupational health and safety (OHS) conditions, including mental health. Algorithmic management, such as dynamic pricing, can set unachievable or risky tempos and conditions for workers or bleed work into formal leisure time, affecting well-being for workers, their families, and significant others. Workers are not protected from harassment and bullying on the platforms or in the material conditions of their work. Furthermore, the solitary nature of various forms of platform work may also negatively impact workers’ well-being.
Ratings/recognition: Workers are often rated by end-user customers, and these data are mobilised in determining employment status and the allocation of further jobs. These ratings may reflect sexist, ableist, and racist dynamics. Many ratings systems embed an unequal distribution of power between workers and customers. Additionally, worker recognition and satisfaction may take different shapes in platform-mediated work, as there is a relative absence of human-to-human management and the work often takes place in solitude.
Privacy: For some workers, especially those involved in platform-based sex work, robust privacy protections are needed to avoid stigma and to ensure safety. Privacy can be compromised by platform design and access requirements, including needing to provide identity documents to join services. Workers who are socially marginalised by gender, sex, sexuality, or race can be more at risk from harm after breaches of privacy than others.
Social organising/representation: Due to their (mis)classification as self-employed, platform workers may not be recognised by labour unions nor in any collective labour agreements. This impacts their right to collective bargaining and, consequently, their labour conditions.
Information asymmetries: There are substantial information asymmetries that disempower workers. Workers and service recipients often have limited information about the way platforms and algorithmic management work; workers have little information about their clients/service recipients and the jobs they should perform; and service recipients often have more information about the worker than workers have about clients.
Power asymmetries: Platforms are rife with power asymmetries in which workers fare badly; for example, service recipients have more information on workers (see above) and also have more power to rate workers. Platforms often have unilateral power to change policies and systems that impact workers.
Work/life: Platform work can create temporalities that extend work beyond formal working time, whether into unpaid labour time or through unsustainable work tempos. The flexibility of platform work may also enable better management of work/life, especially the capacity to earn income around other non-work responsibilities such as child care.
Worker specificity: Labour markets are shaped by a great variety of intersectional factors such as race, gender, age, sex, migration status, ethnicity, and class, leading to differential access to work, economic and social outcomes, and employment experiences. The specificities of platform work for particular workers need to be understood better and mapped more extensively.
Gender: Gender is an important variable in the experience of platform work and the capacity to generate income effectively. Women and non-binary people have historically been an economically marginalised group and can find work opportunities within the platform economy that may otherwise be unavailable. However, some studies show that platform work replicates and reinforces traditional gender roles.
Algorithms, rating systems, infrastructure design, and platform policies can work to discriminate against women, female-identifying, and trans workers, reducing their access to, and success on, the platform. Platforms, including cooperative platforms, often fail to address gender specificity in their design or to consider specific gender-based OHS issues, especially harassment and gender-based violence. Platforms also often lack gender equity policies or fail to implement them effectively.
Regulators, legislators, and policymakers do not adequately capture gender within their data collection and consequently do not factor it into policy design, reducing the protections for women or female-identifying workers.
Offline gender politics can shape how some platform workers value and price their work, with women often less willing to increase prices or seek higher paid work. Service users may also actively discriminate on the basis of gender in selecting workers.
Race: Race and racism are also important variables in how platform work is experienced. There is relatively little data on racial identity and its impact on platform work that can be used to inform regulations, platform design, and policies.
Service users can overtly discriminate against workers on racial grounds, but there are also structural elements of platform design, including algorithms, that embed racist principles. Platform and economic model design are typically based on a normative user who is white, reflecting the relatively privileged experiences of white people. Platforms also often lack anti-discrimination policies or implement them only loosely.
Algorithms, ratings systems, infrastructures, and governance policies can entrench inequalities in income distribution, work in ways that cause inadvertent harms to racially marginalised workers, and/or actively discriminate. Racially marginalised platform workers are often required to spend energy, and thus unpaid work-time, managing their profiles and client relationships to divert the effects of racial politics on their livelihoods.
Migration status: Many platform workers are migrants, and are sometimes undocumented. They are thus more vulnerable to fluctuations in platform work allocation, less able to seek redress, and less able to access social services to sustain a livelihood. They may also have limited access to the financial services or documentation required to secure work or payments via the platforms.
Disability: People with disabilities have low workforce participation and poor employment outcomes, and this is also true within the platform economy. Platform work can provide a valuable avenue for work, but some people with disabilities need appropriate training and development in workplace skills, digital literacies, and entrepreneurial practices. Platform design can make some forms of work inaccessible. There is very little research engaging with the impact of disability on platform work.
Age: Various studies warn how platforms engage in age discrimination by inquiring about workers’ age and using age-related proxies. Furthermore, cultural discourses propagated by platforms centre younger workers as being more ‘attractive’ and ‘flexible’. This is especially relevant to platforms in which identities and physiques of workers are visible and made relevant in worker selection by end users/service recipients.
Children and adolescents are also integral parts of the digital technology and platform landscape, shifting concepts and perceptions of childhood and labour. Features of ‘work’ can be found in the leisure activities of the young population on digital platforms, such as in the work of child influencers or streamers. These new forms of child labour are typically not included in legal definitions of child labour nor in child protection legislation, exposing children to new risks of exploitation.
Digital literacy and accessible design: Digital literacy shapes access to work. Not all individuals possess the same level of such skills or the information/knowledge required to engage in platform work effectively or at all. Opportunities to gain these skills can be stratified along gender, class, and racial lines and also relate to a worker’s disability status.
Access to technology/devices and connectivity: To be able to engage with the platform economy, workers need to have access to digital devices and internet connectivity. Not everyone has the means to pay for the required data and some studies show how many location-based platform workers make use of free Wi-Fi networks in public spaces, which also exposes them to privacy and data breaches.
Workers may also need to invest in infrastructure such as cars or bicycles, as well as in technical training programmes, introducing financial burdens. Workers may need to take out loans to cover such costs, leaving them at risk in cases of platform failure or changes in labour conditions, and placing them in conditions of debt.
Secure and stable labour contracts; fair, perhaps minimum, wages and conditions; more substantial debate on what constitutes working time and what costs should be remunerated; adequate OHS policies and enforcement; transparent and equitable management practices; migration and labour law amendments; gender and equality regulation and protections; labour protections, including for children and migrants; enhanced social protection and welfare safety nets; the right to organise; robust consent and data management systems; more data on all intersectional categories and their impact on experiences and outcomes of platform work; training and literacy programmes in all elements of platform work; effective and fully implemented equality and anti-discrimination policies informing the design and management of all aspects of platform work; worker voice in business and technical decision-making.
End users, consumers and clients
Customers, clients or consumers of platform labour are important stakeholders in the platform economy. They seek a cheap, scalable workforce that is available on demand, but this workforce is often invisible to its ostensible employers. Clients’ willingness to pay, and the threshold of these payments, are key to worker incomes and the viability of platforms. In some forms of platform work, clients also generate some of the labour and OHS conditions (for instance, restaurants providing facilities for delivery riders). They also generate ratings that contribute to the algorithmic management of workers.
Clients or customers of platform work may be individuals or companies.
Wages: The desire for cheap labour can place downward pressure on incomes for platform workers as they compete to secure clients. Clients can also refuse to compensate workers upon completion of a task.
Trust: Clients need to be able to trust the workers and the quality of the work being managed by the platforms. Rating systems enable the building of this trust, although they are often unfairly applied by clients and platforms. Clients may be unaware of the impact of their ratings when manifested in algorithmic management systems. Reliable and trustworthy algorithms are also required by clients to ensure the best outcomes.
Efficient processing: Clients can demand work at tempos or within timeframes that may not be feasible or that negatively impact the physical and/or mental health of workers.
Inequalities: Clients can impose ratings on workers based on misogynist, racist, homophobic, or transphobic principles, leading to inequalities among workers.
More transparency about the impact and application of ratings on workers; minimum wage and labour conditions requirements for platform clients; escrowing of worker payments; reduced capacity to reject work unilaterally; increased oversight by platforms of client demands.