Ubiquitous Computing and Privacy
Computers, sensors, microcommunicators, and tiny batteries are as cheap as toilet paper, and can be imprinted directly onto most surfaces. Processors are invisible, hidden in infrastructure and in almost every device. Humans exist in an invisible web of infrared, laser, and radio signals. Material goods from sneakers to bricks are “smart,” capable of exchanging data.
Hungry? Slide a TV dinner into the oven. Your microwave scans the package’s nanodot code or v-tag and selects the setting. When it’s done, the oven signals your virtual interface, which pages you. Prefer to do your own cooking? Your virtual interface contacts the Web and downloads a menu. If you left the groceries in their original packaging, it scans the labels and checks them off as you pick them up. Can’t find something? Ask the fridge. “Pizza shells on the first shelf,” it tells you. You pull out the anchovies instead of the pineapple. Your virtual interface scans the label on the can. “Those are anchovies,” its inventory agent warns. “Try again.”
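A minimal sketch of how such an inventory agent might behave, assuming a v-tag simply encodes the item name: scanned packages are checked off against a downloaded ingredient list, and mismatches trigger the “try again” warning. The class, tag format, and ingredient names are invented for illustration, not anything specified in the setting.

```python
# Hypothetical kitchen inventory agent: checks off each ingredient as its
# package tag is scanned, and warns when the wrong item is picked up.

class InventoryAgent:
    def __init__(self, recipe_ingredients):
        self.needed = set(recipe_ingredients)   # ingredients still to collect
        self.checked_off = set()

    def scan(self, vtag):
        """Called by the virtual interface whenever a package label is read."""
        item = vtag["item"]                      # e.g. {"item": "pineapple"}
        if item in self.needed:
            self.needed.remove(item)
            self.checked_off.add(item)
            return f"{item} checked off ({len(self.needed)} to go)."
        if item in self.checked_off:
            return f"You already have the {item}."
        return f"Those are {item}. Try again."


agent = InventoryAgent({"pizza shells", "pineapple", "mozzarella"})
print(agent.scan({"item": "pizza shells"}))   # checked off, 2 to go
print(agent.scan({"item": "anchovies"}))      # "Those are anchovies. Try again."
```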
Ubiquitous monitoring is possible. Tiny “video cameras on a chip” and microbot imaging or audio devices can monitor public spaces. The resulting flood of data can be processed thanks to the proliferation of AI systems, which may be instructed to inform the authorities if anything interesting or troubling occurs: medical emergencies, criminal or subversive acts, and so on. Parties such as news media or investigators sometimes employ similar technology on a local scale.
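One way to picture that AI filtering layer is a watcher process that classifies incoming sensor events and forwards only the categories it has been told to report; everything else is discarded rather than archived. The event categories and the notification hook below are assumptions made for the sake of the sketch.

```python
# Illustrative AI monitor filtering a public-sensor feed. Only events in the
# configured "reportable" categories are forwarded to the authorities.

REPORTABLE = {"medical_emergency", "violent_crime", "vandalism"}

def monitor(event_stream, notify):
    for event in event_stream:                 # e.g. {"kind": "vandalism", "where": ...}
        if event["kind"] in REPORTABLE:
            notify(event)                      # page the relevant authority
        # uninteresting events are simply dropped


monitor(
    [{"kind": "jaywalking", "where": "5th & Main"},
     {"kind": "medical_emergency", "where": "transit hub"}],
    notify=lambda e: print("ALERT:", e),
)
```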
In practice, this sort of “Big Brother” monitoring is restricted to a few states or colonies with authoritarian regimes. Most countries have passed “public privacy” laws that restrict microbot or nanocamera monitoring of public space to certain designated areas, such as around government buildings or during special events. Otherwise, authorities are often required to apply to a judge to place an area (such as a crime-ridden neighborhood) under public surveillance, much as if they were requesting a search warrant. “Black ops” agencies may “forget” to ask for permission, which can lead to scandal when discovered.
Some places permit monitoring but have strict restrictions on when information can be accessed and by whom. Singapore is an example; it has extensive urban surveillance but equally strict laws on when and how humans can access the AI-filtered databases.
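The Singapore-style “collect everything, restrict access” model can be sketched as an archive that ingests footage freely but answers queries only when the requester presents a valid, scoped authorization. The authorization record, its fields, and the class below are hypothetical illustrations of the idea, not a defined system from the setting.

```python
# Sketch of access-controlled surveillance data: footage is archived, but a
# query succeeds only with a matching, unexpired access order.

from datetime import datetime, timedelta

class SurveillanceArchive:
    def __init__(self):
        self._records = []                      # (timestamp, district, clip)

    def ingest(self, timestamp, district, clip):
        self._records.append((timestamp, district, clip))

    def query(self, requester, authorization, district):
        # An authorization names who may look, where, and until when.
        if (authorization["holder"] != requester
                or authorization["district"] != district
                or authorization["expires"] < datetime.now()):
            raise PermissionError("No valid access order for this query.")
        return [clip for _, d, clip in self._records if d == district]


archive = SurveillanceArchive()
archive.ingest(datetime.now(), "old town", "clip-0042")
order = {"holder": "det. Ames", "district": "old town",
         "expires": datetime.now() + timedelta(days=7)}
print(archive.query("det. Ames", order, "old town"))    # ['clip-0042']
```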
A few places, especially those with libertarian societies, have no government monitoring (or no government), but let investigators, concerned citizens, media, and anyone else freely monitor public spaces. Individuals can set up whatever privacy they can afford for their own persons and property. Residents who can get the neighbors to agree with them may declare a “privacy zone,” and hire security firms to use swarms of microbot bug-hunters or other antisurveillance systems to sweep the streets clean of annoying snoops.
Despite these measures and regulations, public privacy is fundamentally limited because just about every person has a virtual interface with a built-in camera capable of recognizing faces and identifying whom they belong to (see Augmented Reality). On the other hand, at least you can spot them doing it at the same time. Children (and often bioroids) have less privacy, since a parent’s ability to monitor dependents is rarely legally restricted. It is common to fit them with biomonitors, give them virtual interfaces whose infomorphs owe a higher allegiance, or give them an allowance with coded limits on how much virtual cash can be spent where. This can result in a barter economy among kids...
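Those coded allowance limits amount to a spending policy enforced at transaction time: each purchase is checked against per-category caps before any virtual cash is released. The category names, amounts, and class below are invented purely to illustrate the mechanism.

```python
# Hypothetical allowance with coded limits: purchases are approved or refused
# against per-category caps before virtual cash changes hands.

class Allowance:
    def __init__(self, limits):
        self.limits = dict(limits)              # e.g. {"snacks": 10.0, "games": 5.0}
        self.spent = {category: 0.0 for category in self.limits}

    def spend(self, category, amount):
        if category not in self.limits:
            return False, "Purchases in this category are not permitted."
        if self.spent[category] + amount > self.limits[category]:
            return False, "This would exceed your limit; ask a parent."
        self.spent[category] += amount
        return True, "Approved."


pocket_money = Allowance({"snacks": 10.0, "games": 5.0})
print(pocket_money.spend("snacks", 4.0))   # (True, 'Approved.')
print(pocket_money.spend("games", 8.0))    # refused: over the cap
```

Anything a policy like this refuses to authorize is exactly what ends up being bartered for instead.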
Workers often have more privacy than they did in the 20th century, since a greater percentage are contractors working at home. Even so, since they are usually paid for results rather than by the hour, contractors working in service jobs are often monitored to a greater or lesser degree: supervisory software tracks the number of customers served, people’s responses, and so on. Other jobs pair human workers with low-sapient or sapient AI “partners.” These may be valued co-workers and even liked, but in jobs where the company provides them, no one forgets whom they actually report to. The degree of AI supervision is often the subject of vigorous contract negotiation.
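A rough sketch of what such supervisory software might report for a results-paid service contractor: a tally of customers served, how many issues were resolved, and average feedback. The field names and summary shape are assumptions for illustration only.

```python
# Sketch of supervisory metrics for a service contractor paid by results.

from statistics import mean

def summarize_shift(interactions):
    """interactions: list of {"customer": ..., "rating": 1-5, "resolved": bool}."""
    if not interactions:
        return {"served": 0, "resolution_rate": 0.0, "avg_rating": None}
    return {
        "served": len(interactions),
        "resolution_rate": sum(i["resolved"] for i in interactions) / len(interactions),
        "avg_rating": mean(i["rating"] for i in interactions),
    }


print(summarize_shift([
    {"customer": "c-101", "rating": 5, "resolved": True},
    {"customer": "c-102", "rating": 3, "resolved": False},
]))
```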
Ubiquitous monitoring is quite possible in private spaces, although employee contracts and labor law may limit its extent. Secure installations such as military bases or laboratories are often heavily monitored, but there may be strict limits on who has sufficient security clearance to view the reports.