We live in a world where technology surprises us every day with new, innovative tools. Marketers and organizations know nearly everything about us through data collection: our religion, love life, work life, and many more details of our private lives. The unease we feel when this data is mined for our habits, behaviors, likes, and dislikes is often called the “creep factor” of big data.
The creep factor arises when our data is used in ways that make consumers feel they have to actively protect their privacy. As we think about the future of privacy regulation, the creep factor seems fairly central. When companies use our data for our benefit, we know it and we are grateful for it. We happily give up our location data to Google, Yelp, or Foursquare so they can give us directions, and we don’t even mind when they save that data if it helps them make better recommendations in the future. The right way to deal with data redlining is to think about the possible harms to the people whose data is being collected, and primarily to regulate those harms, rather than the collection of the data itself, which can also be put to powerful use for those same people’s benefit.
The creep factor also appears when consumers have no control over the privacy of their personal lives. It is in the eye of the consumer, and what’s creepy to some may be “cool” to others. Surprisingly, millennials are more comfortable with the notion of personalization than older generations. Companies should keep one principle in mind:
Ultimately, the idea is to guide customers through a journey as opposed to coming on strong and invading their personal space. The result is they won’t feel the creep factor; they’ll feel like they had a great customer experience.
There are some creep factors that companies should be careful about:
Broad and granular data collection:
We’re social creatures, and we love to express our feelings. We observe others and try to learn from what people share. But there’s a point at which data collection can make consumers feel like they’re trapped in a surveillance net. Consumers will do anything to avoid collection that is both broad (i.e., relating to behavior in multiple contexts, like emailing, texting, web browsing, and voice calling) and granular (i.e., capturing details of the behavior, as in keystroke logging).
Linking behavioral data with unique identifiers:
One of the most powerful ways to deliver targeted ads is to assign a unique identifier to each individual and track that consumer’s online behavior across multiple sites, platforms, and apps, linking every behavioral observation to that identifier.
The assumptions marketers draw from that behavior can have negative, positive, or neutral connotations, and if the underlying assumptions are negative, consumers will likely find the practice intrusive.
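The linking practice described above can be sketched in a few lines. This is a hypothetical illustration, not any real ad platform’s API: the identifier, source names, and events are all made up.

```python
# A minimal sketch of linking behavioral data to a persistent unique
# identifier across sites and apps (all names here are hypothetical).
from collections import defaultdict

# Profile store: unique identifier -> list of observed behaviors.
profiles = defaultdict(list)

def record_event(user_id: str, source: str, event: str) -> None:
    """Attach an observed behavior to the user's cross-context profile."""
    profiles[user_id].append({"source": source, "event": event})

# The same identifier ties together behavior from unrelated contexts.
record_event("u-1138", "news-site.example", "read: mortgage rates")
record_event("u-1138", "shopping-app", "searched: strollers")
record_event("u-1138", "maps-app", "navigated: 123 Main St")

print(len(profiles["u-1138"]))  # one profile now holds all three events
```

The creepiness comes from the join: any one of these events is innocuous, but a single identifier accumulating all of them is a detailed portrait of a person.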
The sensitivity of data collection:
Some data is simply too sensitive to collect. Discussion threads, reviews, and feedback often contain things consumers never meant to disclose to marketers, and in some cases consumers experience their collection as a violation.
Impact on device operability:
This is one issue that courts view as a legally cognizable harm. If data collection and tracking technology significantly impair the operability of users’ computers or mobile devices, as in the case of adware, spyware, and malware, the sense of intrusion can be overwhelming. Consumers will run, not walk, away from these kinds of practices.
Inability to opt out:
Zombie cookies, which respawn after users delete them, can make it virtually impossible for users to opt out of being tracked. Any company using zombie cookies to collect or monetize sensitive information should expect consumers to find the practice deeply creepy.
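The zombie-cookie mechanism can be sketched as follows. This is a simplified, hypothetical model: the storage names stand in for the real hiding places trackers use (HTTP cookies, Flash local shared objects, HTML5 local storage, and so on).

```python
# A minimal sketch of the zombie-cookie idea: the tracker stashes the
# identifier in several storage locations and rebuilds any copy the user
# deletes. Storage names are illustrative stand-ins.
STORES = ["http_cookie", "flash_lso", "local_storage"]

def respawn(storage: dict, fallback_id: str) -> str:
    """Restore the identifier from any surviving copy."""
    surviving = next((storage[s] for s in STORES if storage.get(s)), fallback_id)
    for s in STORES:               # re-plant it in every location
        storage[s] = surviving
    return surviving

storage = {s: "u-1138" for s in STORES}
storage["http_cookie"] = None      # user clears their cookies...
respawn(storage, "new-id")         # ...and the tracker puts the old ID back
print(storage["http_cookie"])      # still "u-1138"
```

Deleting one copy accomplishes nothing as long as any other copy survives, which is exactly why users experience these cookies as impossible to escape.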
Lack of notice:
Online apps and services may provide various types of notice to users about what’s being done with their data, but it’s safe to say that any OBA data-collection practice conducted with absolutely no consumer notice is seriously disturbing. A good example of this is a practice called “device fingerprinting.” Device fingerprinting creates a unique identifier for computers, cell phones, and other devices based on a combination of externally observable characteristics like installed font styles, clock settings, and TCP/IP configuration. In addition to being problematic because it creates a persistent, unique identifier (see “Linking behavioral data with unique identifiers” above), this information is collected “passively,” and in most instances users can’t even detect that it’s happening.
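To see why fingerprinting needs no cookie at all, here is a minimal sketch, under the assumption that a tracker can already observe attributes like fonts and clock settings; the attribute names and values are invented for illustration, not a real browser API.

```python
# A minimal sketch of device fingerprinting: hash a set of externally
# observable device attributes into a stable identifier. The attributes
# below are hypothetical examples.
import hashlib
import json

def fingerprint(attributes: dict) -> str:
    """Derive a stable identifier by hashing device attributes."""
    # Serialize with sorted keys so the same attributes always
    # produce the same digest.
    canonical = json.dumps(attributes, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:16]

device = {
    "fonts": ["Arial", "Helvetica", "Times New Roman"],
    "timezone": "UTC-5",
    "tcp_window_size": 65535,
}

# The same device yields the same identifier on every visit --
# nothing is stored on the device, so there is nothing to delete.
print(fingerprint(device))
```

Because the identifier is recomputed from the device itself on each visit, there is no artifact for the user to find or clear, which is what makes the practice passive and hard to detect.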
There are undoubtedly many other creep factors, and the point is that not all data collection is a threat to consumers’ sense of personal privacy. By identifying the specific practices likely to be viewed as intrusive, industry leaders, trade organizations, and regulatory bodies may find it easier to determine the level of notice required, or whether some practices should be prohibited outright. These benchmarks may also be useful for companies developing OBA and tracking technologies who want to build sustainable businesses. After all, creep factors are unappealing to everyone.
For more information about the tech world, keep an eye on Right Attitude.