In today’s increasingly digital world, have you ever wondered why people don’t automatically embrace new technology, even when it has proven useful in so many ways? Before my career in academia, I worked in IT as a database architect and programmer. That work led me to ask a question: is it the technology, or is it the people?
Both in everyday life and in the workplace, people prefer to use technology they are familiar with, and forcing them to change or adapt to something new can be uncomfortable. Businesses are realizing the challenges of integrating new technologies: tools designed to make processes and performance management more efficient can still cause problems when first rolled out to employees. By applying the following insights, businesses can better understand why certain technologies fail and how to engage employees in the process.
Technology as an extension of ourselves
Historically, we introduced technology first in the workplace and then applied it to our personal lives. Now, the internet, social media, smartphones and other intelligent devices, such as Amazon’s Alexa and Apple’s Siri, have made technology more accessible—and more connected—than ever before. Thus, companies are finding they need to adjust to the technology people already use.
There’s a reason people, particularly young people, seem addicted to their screens: technology empowers them. Consumer technologies are now viewed as an extension of our own capabilities. We can go anywhere with the help of GPS, answer any question thanks to Google, and collaborate and crowdsource from across the globe. More importantly, we use technology to understand who we are and our place in the world.
This is why companies like Apple and Facebook are adept at designing technology people will use. Rather than focusing on the technology itself, they focus on the user experience and the ways people draw from that experience to connect with the world around them. Similar to a “choose your own adventure” video game, users can interact with the technology in ways that support their identities.
Driving behavior change
Oftentimes when projects fail in the workplace, it’s not because the technology is faulty but because employees don’t want to use it. People who are forced to adopt a new technology may start finding workarounds and go back to what they’re accustomed to. With this in mind, companies should get a sense of how employees identify with technologies they already use and then develop incentives and mechanisms to shift that identification over time.
It’s unlikely that a two-week training period, which is often all companies provide when rolling out something new, will drive the desired behavior change. Instead, companies must find creative ways to make the technology their employees already use less attractive, and thus harder to identify with. There are various ways to go about this, from rewarding employees for using the new technology while not rewarding use of the existing one, to changing employees’ job descriptions to include use of the new technology.
The way people view themselves is a great predictor of their behavior. In order to identify with new or different technology, it’s important for people to feel a sense of accomplishment when using it. When the technology allows them to explore something new or solve problems, they are more likely to engage in exploratory behavior and begin to identify with the technology on a personal level.
Taking an inclusive approach
Technology has changed the world forever, but that doesn’t mean we should leave people behind in the process. Sometimes an expectation to keep up with the latest technology ends up marginalizing those who don’t have access or don’t understand it, such as low-income communities or the elderly. Companies must also recognize that not everyone identifies with technology—some consider it a necessity, while others may feel incompetent and unable to adapt.
In Seattle, we’ve experienced a major technology boom that has led to great prosperity for some but displacement for others. As tech companies expand and more people move into the area, challenging issues related to rapid growth have emerged, such as increased traffic congestion, higher costs of living and homelessness. Business plays a role in this ecosystem, so we must be cognizant that technological advancements aimed at solving one problem may in fact lead to others.
In general, society needs to be more inclusive in its approach to technology. It’s not realistic for companies to simply swap out one technology for another and expect immediate buy-in. So, it’s increasingly important to understand how different groups identify with technology and use it to navigate the world. And when introducing something new, businesses must both present the need for the new technology and give employees a reason to connect with it.
Michelle Carter, Ph.D., serves as assistant professor in the Department of Management, Information Systems, and Entrepreneurship at Washington State University, based in Everett, Wash. Her current research investigates the involvement of IT in identity, humanness and social change in an increasingly digital world. Michelle also serves as president for the Association of Information Systems (AIS) Special Interest Group on Social Inclusion (SIGSI).