
Software Bots Have Sharing Issues Too

Jeffrey Kok VP, Solution Engineers, Asia Pacific and Japan, CyberArk 

Estimated reading time: 3 minutes

It’s not just humans who are susceptible to clicking on the wrong link, or who are a little too cavalier about what they share about themselves. Software bots have sharing issues too, and this Data Privacy Day we highlight how we can better protect the data they access from being exposed.

Software bots – little pieces of code that do repetitive tasks – exist in huge numbers in organisations around the world, in banking, government and all other major verticals. The idea behind them is that they free up human staff for business-critical, cognitive and creative work, while also improving efficiency, accuracy, agility and scalability. They are a major component of digital business.

The privacy problem arises when you start to think about what these bots need so they can do what they do. Much of the time it’s access: if they gather sensitive and personal medical data to help doctors make informed clinical predictions, they need access to it. If they need to process customer data stored on a public cloud server or a web portal, they need to get to it.


We’ve seen the problems that can arise when humans get compromised, and the same can happen to bots – at scale. If bots are badly configured or coded so that they can access more data than they need, they may end up leaking that data to places it shouldn’t be.

Likewise, we hear about insider attacks and humans being compromised to get at sensitive data virtually daily. Machines have the exact same security issues: if they can access sensitive data and aren’t properly secured, that’s an open door for attackers – one that can put individuals’ privacy at risk.

Attackers don’t target humans to get to data; they target the data itself. If machines – especially those in charge of automated processes (think repeatable tasks like bank transfers, scraping web data and moving customer data files) – are the best path to it, that’s the path they will choose.
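One way to limit the damage a compromised or badly coded bot can do is to enforce least privilege explicitly: check every data request against the scopes actually provisioned for that bot, and deny everything else by default. The sketch below is purely illustrative – the bot names and scope strings are hypothetical, and real deployments would back this with a secrets vault and centrally managed policy rather than an in-code table.

```python
# Hypothetical least-privilege check for software bots. Each bot is
# provisioned with an explicit allow-list of scopes; anything not on
# the list is denied by default. Bot IDs and scope names are made up
# for illustration.

ALLOWED_SCOPES = {
    "invoice-bot": {"read:invoices"},
    "transfer-bot": {"read:accounts", "write:transfers"},
}

def authorize(bot_id: str, requested_scope: str) -> bool:
    """Grant access only if the scope was explicitly provisioned for this bot.

    Unknown bots get an empty scope set, so they are denied everything.
    """
    return requested_scope in ALLOWED_SCOPES.get(bot_id, set())

# A bot asking only for what it needs is allowed; a bot reaching for
# unrelated sensitive data is refused rather than silently granted.
print(authorize("invoice-bot", "read:invoices"))   # True
print(authorize("invoice-bot", "read:patients"))   # False
```

The design point is the default-deny posture: access has to be granted deliberately per bot, so an over-permissioned bot is a configuration you can audit rather than an accident you discover after a leak.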

BYOD, The Sequel: Your Privacy At Risk

One of the many, many ways that life has changed for many of us in the last couple of years is the hybrid work phenomenon. On the plus side we are supporting our local coffee shops and saving money on the commute.  On the minus side we don’t get to see our colleagues and we wear jogging bottoms every day. Oh, and we’re causing data privacy issues for our employers and their customers.

The hybrid work model has changed the privacy game as well. It means companies have to re-evaluate how data privacy is enforced in 2022, and securing access to sensitive data from remote employees will be a major focus.

The roots of this are buried in the sudden change of environment that many companies had to provide for. Security policies were written on the premise that there are premises: people might work in a restricted-access area or room, on company-provided workstations that were kept up to date, with security policies that were easier to enforce.

Covid spurred many privacy issues here. Many workers use home computers that are shared between work, personal use and children’s virtual learning, and these do not have the same level of security as a corporate-managed office workstation.

Likewise, the concept of secure work areas at home is not a realistic one for most of us, with families, flatmates or strangers in the coffee shop peering over our shoulders.
