Saturday, February 24, 2018

Tech's true power comes from design, a Harvard professor thinks.

There's great risk attached to handing personal information to tech companies through everyday connections, these days even as you sleep. Harvard professor Latanya Sweeney says the companies' great power comes not only from the inescapable presence of their creations but from their ability to rapidly shift cultural norms around what counts as acceptable data privacy, leaving lawmakers and watchdogs dazed in their wake.

It's all a matter of design, Sweeney says.

Consider the Sleep Number bed. Then consider the Apple Watch.

On the surface, they're both health devices—the new Sleep Number beds come equipped with sensors to monitor sleep, and the Apple Watch gathers data on your daily physical activity. But the ways they store the data gathered by monitoring your body are different. While your sleeping patterns are sent to Sleep Number's servers, Apple Watch data are stored only on your personal device or a connected iPhone.

The data describing the way you sleep are sent somewhere you have no control over. They can be sold, shared, or analyzed by anyone the company decides to give access to. As Sleep Number states in its end user agreement:

"You, on behalf of Yourself and any Child or other person that accesses or uses the Services and the System through Your Bed, hereby assign, and shall be deemed to have assigned going forward, to Us complete and sole ownership in and to the Data."

Your Apple Watch data, by contrast, are shared only if you choose to share them.
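
To make the contrast concrete, here is a minimal, hypothetical sketch of the two designs in Python. The class names, the reading format, and the stand-in upload function are all invented for illustration; neither company's actual software looks like this. The only point is where a reading ends up by default.

# A minimal, hypothetical sketch of the two data-flow designs described above.
# The class names and the stand-in upload function are invented for
# illustration; neither company's real software is represented here.
import json


def send_to_vendor(reading: dict) -> None:
    """Stand-in for an HTTPS upload to a vendor's servers."""
    print("uploaded to vendor:", json.dumps(reading))


class CloudFirstTracker:
    """Design A: every reading is shipped off the device by default."""

    def record(self, reading: dict) -> None:
        send_to_vendor(reading)  # the user no longer controls this copy


class DeviceFirstTracker:
    """Design B: readings stay on the device unless the user exports them."""

    def __init__(self) -> None:
        self._local_store: list = []

    def record(self, reading: dict) -> None:
        self._local_store.append(reading)  # never leaves the device by default

    def export(self) -> str:
        """Sharing happens only when the user explicitly asks for it."""
        return json.dumps(self._local_store)


if __name__ == "__main__":
    cloud, local = CloudFirstTracker(), DeviceFirstTracker()
    cloud.record({"hours_slept": 7.5})  # copy now lives on someone else's server
    local.record({"hours_slept": 7.5})  # copy stays with the user

The privacy difference is not in what is measured but in that one default: one design assumes the vendor gets a copy of everything, the other assumes nothing leaves the device until the user acts.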

Connecting to the problem

Sweeney argues the fact that technology companies are allowed to make the decision to own that data—and data is power in the age of AI—shows we're living in a technocracy, a society now ruled by technology.

"The design of the technology, and how it works is really the new policy," she said yesterday (Feb. 23) at the Fairness, Accountability, and Transparency Conference in New York City. "And the thing about these designs as policymakers is that we didn't vote for them, we didn't elect them, and we didn't have any say in the things that they believed in. And yet, the decisions that they make turn out to be the rules that we have to live by."

Sweeney is one of the foremost experts on how decisions made by technology companies can shape our society. She has conducted research showing bias in Google's search ads, served as CTO of the US Federal Trade Commission, and directs the Data Privacy Lab at Harvard. Her first experience in the field of data privacy came in 1997, with an experiment to see if she could match anonymized health records to the state's then-governor, William Weld.

She obtained the anonymized hospital records the state made available at the time, then bought the Cambridge voter rolls, which cost $20 and came on two floppy disks. By linking the two datasets on date of birth, gender, and zip code, she matched the governor to his medical records. With that, the "anonymous" data was re-identified and Sweeney made the match.
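
Her experiment is the canonical example of a linkage attack: joining a "de-identified" dataset to a public one on the quasi-identifiers they share. Below is a toy sketch of the idea, assuming pandas is available; every name, date, and record is fabricated for illustration and has nothing to do with the actual 1997 data.

# Toy illustration of a linkage (re-identification) attack: joining a
# "de-identified" health table to a public voter roll on the quasi-identifiers
# they share. Every record below is fabricated for illustration.
import pandas as pd

# De-identified health records: no names, but quasi-identifiers remain.
health = pd.DataFrame([
    {"zip": "02138", "birth_date": "1950-06-15", "sex": "M", "diagnosis": "A"},
    {"zip": "02139", "birth_date": "1980-01-15", "sex": "F", "diagnosis": "B"},
])

# Public voter roll: names plus the same quasi-identifiers.
voters = pd.DataFrame([
    {"name": "Pat Example", "zip": "02138", "birth_date": "1950-06-15", "sex": "M"},
    {"name": "Sam Sample",  "zip": "02141", "birth_date": "1962-03-02", "sex": "F"},
])

# The join quietly re-attaches an identity to an "anonymous" record.
reidentified = health.merge(voters, on=["zip", "birth_date", "sex"])
print(reidentified[["name", "diagnosis"]])

Her later research found that zip code, birth date, and sex alone are enough to uniquely identify most of the US population, which is why a join this simple can work.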

That story has since repeated itself over and over, as supposedly anonymized datasets containing personal information keep getting re-identified. It happened to AOL. And to Netflix.

How things replicate

This is why the Sleep Number bed and the Apple Watch stand as stark examples of what a single design decision can mean for thousands or millions of people. And when tech companies like Facebook operate at the scale of billions of users, any design decision that creates a privacy risk can be dangerous.

Take Strava, the fitness-tracking company, which was recently accused of leaking sensitive data about paths at US military bases. The company released a heat map of popular running routes, but the routes were precise enough to reveal normally secret paths inside military compounds.
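
To see why aggregation alone doesn't solve the problem, consider a toy sketch of a heat map pipeline: bin location fixes into grid cells and publish only per-cell visit counts. The coordinates, grid size, and code below are invented for illustration and are not Strava's actual pipeline; they only show that in a quiet area the "aggregate" counts still trace a single route.

# Toy sketch: binning GPS fixes into a coarse grid and publishing only the
# per-cell visit counts. In a quiet area, those counts still trace a route.
# All coordinates here are made up for illustration.
from collections import Counter

# Fixes (lat, lon) from a few laps along the same stretch of track.
fixes = [(10.0001 + 0.0003 * i, 20.0001) for i in range(10)] * 3

def cell(lat: float, lon: float, precision: int = 4) -> tuple:
    """Round a fix to a grid cell roughly tens of meters across."""
    return (round(lat, precision), round(lon, precision))

heatmap = Counter(cell(lat, lon) for lat, lon in fixes)

# Publishing these counts with no user IDs still draws the path on a map.
for grid_cell, visits in sorted(heatmap.items()):
    print(grid_cell, "visits:", visits)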

Still, don't expect changes soon: once business models and technology trends catch on, they're widely imitated, embedding them even deeper into society. That's why careful design of any original technology matters so much in the first place.

"Technology has this incredible habit of replicating itself," Sweeney said. "Once a design or business practice works, it gets replicated just as it is. The design of the technology really does dictate the rules that we have to live by."
