Before deciding to attend graduate school, I worked as a writer, editor, and content strategist for a small, faith-based liberal arts university. I was a member of that forgotten “third” rank behind faculty and students: staff. In some respects, the institution functioned more like a church than a work organization. It offered university-sponsored book clubs with faculty and staff as well as optional staff chapel events for worship, prayer, and fellowship. Such opportunities for socializing and fun were not instances of “time theft” but ways to strengthen a working community. This differed from my previous employers’ mindsets, where opportunities for personal edification and communal ritual-building were scant.
Things seem to be taking a turn within the “world of work.” A casual survey of online job postings reveals a fair amount of attention now paid to the quality and contour of the work environments themselves. Organizations boast of well-serviced cafés and break rooms redolent of Vegas lounges. They emphasize robust fitness and wellness programs in which employees are welcome—even strongly encouraged—to participate to foster optimal health and befriend a cohort of like-bodied coworkers. They espouse mindfulness meditation and stress relief programs, communal self-centering and calming activities that enable the type of holistic development that has become a staple for young (and not-so-young) urbanites.
These well-rounded work environments dovetail nicely with the rise in workplace flexibility. The number of remote workers and freelancers in the country continues to grow, as does the number of coworking-space companies. Benefit packages boasting unlimited paid time off (PTO)—a notion that would have been unthinkable to HR managers a couple of decades ago—are much more commonplace. Many people now expect employers to treat them as holistic, multifaceted individuals with varied needs, preferences, and abilities that can only be properly satisfied through personalized attention.
This marks a departure from the “cog-in-the-machine” environment of classic American workplaces. As long as a job offered a consistent salary, a pension or a contributory 401(k), and enough time off each year to recharge (a metaphor that itself reveals our tendency to equate human labor with automaton activity), a monotonous or creatively stifling job could still be considered, well, decent. Yet human beings do not do all that well when treated like ancillary machines along a factory line, and burgeoning jobs within the “knowledge economy” require a logic of labor that is not easily quantifiable. Cal Newport touches on this in his book Deep Work, writing that uninterrupted, focused attention for a relatively short stretch of time (three to four hours) can yield far greater productivity than a protracted workday, which often foments inefficiency, boredom, and bruised morale.
In addition to flexible work schedules, less supervision by Big Brother, and feng shui conference rooms, there is a growing expectation that our jobs afford us greater outlets for self-expression and fulfillment. “Busy work” that grants us an income to do what we actually want to do outside of work—be that pursuing a hobby, spending time with friends and family, or simply bingeing on Netflix—is not cutting it anymore, even if we are rewarded handsomely.
In Bullshit Jobs: A Theory, anthropologist David Graeber argues that many modern jobs, while at times providing hefty compensation packages and desirable reputational status (let us not forget the allure of job titles), are in actuality rather useless (think filling out labyrinthine spreadsheets or crafting robust SOPs that no one will ever read). He coins the term “bullshit job” for this type of labor, defining it as “a form of paid employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence even though, as part of the conditions of employment, the employee feels obliged to pretend that this is not the case.”
As a result, many companies are heeding the signs of the times to secure the best talent: devising roles that align with the contours of a busy life, harness one’s inimitable skill set, allow for the formation of meaningful community, and render a lasting sense of fulfillment. As with any major cultural and sociological phenomenon, a number of factors contribute to this shift in worker expectations. Perhaps one of the more conspicuous influences, however, is how technological innovation has altered the way we work. We cannot deny that greater flexibility, for example, becomes a more feasible option when technology allows us to jump on a conference call while taking the kids to school or troubleshoot a problem from our in-laws’ guest bedroom.
But not everyone is a fan of technology’s influence on the way we work or of our shifting expectations about what our workplaces should provide. Some see the hegemonic influence of technology as an irredeemable negative: the eventual supplanting of human labor by technology will deprive human beings of the busy work that keeps us meaningfully engaged at best and distracted at worst. Some estimate that by 2030 fifty percent of our jobs will be lost to rapid innovation. And if that is true, what about by 2050? What will we do if there are no jobs left for human beings? How will we manage a legion of jobless, bored, and malaise-ridden citizens?
In the other corner, we have the technocratic optimists, those beaming with utopian hope. They claim we will only continue to see new jobs come to the fore, just as we have over the last hundred years of technological innovation. We may no longer have neon sign makers, bookbinders, video-rental store workers, cabbies, or darkroom technicians, but who could have imagined only a couple of decades ago that we would now have social media specialists, UX designers, computer systems analysts, and Ruby developers? Others within the optimists’ ranks concede there will be fewer jobs for human beings, especially with respect to ascendant A.I. technologies (ChatGPT obviously comes to mind). However, they argue, this will usher in a golden age of leisure, more time to do the things we would do if we were not working all the time: playing music, cooking, reading, writing poetry, crafting handmade furniture, etc. In a sense, we might still consider this a world of work, if we define “work” to mean merely an activity aimed at some outcome. But this new type of “work” would no longer involve toil and monotony. Denuded of its thorns and thistles, it would enable us to achieve fulfillment and reach that final, elusive step of Maslow’s pyramid. As technology tears away the type of work that many love to hate, we will be left to engage in play-work that aligns with our deepest and most creative desires.
Of course, this is hardly a novel hypothesis about the future of the workforce (or, might we say, anti-workforce?). Nearly a century ago, economist John Maynard Keynes imagined a similar future. He wrote that within the lifetime of his contemporaries “we may be able to perform all the operations of agriculture, mining, and manufacture with a quarter of the human effort to which we have been accustomed.” He was right that technology would make us vastly more productive and efficient—more than he could possibly have imagined—yet that shorter workweek has not come to pass. Americans continue to work more hours than workers in most other developed nations. Still, assuming we ever did reach a point where leisure became a prevalent public good, Keynes believed many would remain addicted to an ethic of work due to an “intense, unsatisfied purposiveness,” and would still “blindly pursue wealth” since, well, what else would there be? He did, however, extend hope for at least a portion of humanity, writing:
I see us free, therefore, to return to some of the most sure and certain principles of religion and traditional virtue . . . We shall once more value ends above means and prefer the good to the useful. We shall honour those who can teach us how to pluck the hour and the day virtuously and well, the delightful people who are capable of taking direct enjoyment in things, the lilies of the field who toil not, neither do they spin.
Whether or not we enter an age of leisure in the near future (a doubtful prospect), we are at least beginning to see a collective shift toward this ideal. I think most of us would agree that organizations that recognize the needs of the person beyond financial compensation are onto something good when it comes to human flourishing. Workplaces that allow for deeper co-worker relationships, engagement in meaningful and creative activities, flexible schedules that take family life and health into account, and the treatment of people in holistic and personalized ways are a welcome development. But to what degree can we reasonably expect our work environments to constitute these aspects of our lives, to be the place where ultimate fulfillment and self-expression can be found?
An article from Vox takes a look at the insular and strange subcultures within certain Silicon Valley organizations. From metaphysical “journeys” with the help of a natural drug known as ayahuasca to electronic music festivals to tech retreats, these tech sub-communities offer ritualistic experiences that border on the mystical. The article examines the anti-religious sentiments of many working in the tech industry, pointing to the paradoxical “magical thinking” that surfaces despite an allegiance to strict rationalism and empiricism. As it turns out, certain organizations within Silicon Valley, and a portion of their employees, are prone to forming pseudo-religious workspaces that do not necessarily explore the concept of the transcendent but nevertheless have the trappings of a communal group that would.
To stay with Silicon Valley, an organization has even sprung up with the sole purpose of creating new, secular rituals for everyday life. Ritual Design Lab, according to its website, researches “the power of rituals to build value, meaning [and] community into our everyday experiences,” lending these services to other organizations to help structure their work cultures. Whereas ritualistic participation historically remained within church, community, and family settings, Ritual Design Lab hopes to anchor these social, meaning-attributing practices in the workplace, since many people are not getting them anywhere else.
Over twenty years ago, the political scientist Robert Putnam published Bowling Alone: The Collapse and Revival of American Community, in which he unearthed the modern fragmentation of American social bonds to family, friends, and community. Pew Research has shed light on the steady decline of church membership and participation—an activity that fosters the type of communal bonding that Putnam, notwithstanding criticisms leveled at his book, claimed was evanescing. The problem, however, is that the workplace will never be able to supplant a church, not because it cannot offer a sense of meaning and fellowship—many do, including the university where I used to work—but because its ultimate goal is to produce a good or service, not to know and love a transcendent God in the company of others.
As the American landscape continues to be shaped by technology that fosters digitally mediated relationships, artificial intelligence, and philosophical notions of futurist utopias, the religious impulse will likely continue to find a partial outlet in the workplace—one of the few arenas of social participation still obligatory for most of us. This means our workplaces will keep moving toward offering a communal and holistically engaging environment that looks, sounds, and feels an awful lot like a church, though without that key and most fundamental ingredient: humble worship.