My name's Katie Skinner, and along with Jason Novak, we are going to be talking about privacy in your app this afternoon. We are both members of Product Security and Privacy at Apple. That means we work with teams all across Apple to build privacy into our apps and services. A few of the teams that I work closely with are Apple Pay, Siri, Proactive Assistant, Health, and our newest OS, watchOS. First I am going to talk a little bit about why privacy matters at Apple and how we think about it.

At Apple, we see privacy as a human right. That's a guiding principle that we take every day into how we design our apps, our services, and new versions of iOS, OS X, and watchOS. Users want their privacy respected when they use our products, and all developers, everyone in this room, shares that responsibility. So when you are building your apps, be mindful of user privacy, and build privacy into your apps. Not only are we at Apple focused on building great products, we are also focused on building great tools for you, the developer community, to make it easy for you to respect user privacy and build privacy into your apps and services.

At the end of the day, all of our success relies on our relationship with our users, and trust is key to maintaining that relationship. We've all read stories in the press about breaches, about misuse of user data, and nobody in this room wants to be the next one in the news. Our platform is a place where users are excited about new experiences, excited about downloading new applications. It's all about keeping our app ecosystem healthy and thriving, and the trustworthiness of all of you is part of that. Users trust us with lots of sensitive data, and we need to be good stewards of their data. Architecting for privacy is the place to start.

Now, all data should have a retention policy. How do you come up with that retention policy? The place to start is how you will use that data. If you are no longer using that data, if it's no longer serving a user need, then you should delete it. The more data that you store, the richer and more valuable a target you are for attackers. I believe that all data collected carries risk, so you need to balance the value that you provide to users against the risks inherent in collecting and storing data.

Now, you can mitigate this risk by applying data minimization techniques. Aggregation and de-resolution are both ways to reduce the risk for the data that you retain. To learn more about the full list of techniques and to see examples of how to apply them, go and watch User Privacy in iOS and OS X from last year's WWDC. But which techniques should you apply? The place to start is the use of the data: How are you going to use it? What questions are you going to answer? What decisions is this data driving? If you can't come up with an answer to these questions, if you can't think of anything, then you shouldn't be retaining this data at all. Apply the last technique on the list, minimization, and don't collect or transfer the data at all.

Now, for the data that you decide is actionable, all data sent off device should be protected in transit. Later, Jason will talk about App Transport Security, which is a new way in iOS 9 to make it easier to communicate securely with your services. To reduce the risk further, avoid transferring data off device when possible, and think twice especially about sensitive data categories. This includes things like health data. For example, during a cycling workout, there's information like the user's heart rate, the distance traveled, maybe the user's height and weight, and calibration data, and all of that is processed on the paired devices to show what calories were burned during that workout. None of that data is sent to a server for processing.
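The aggregation and de-resolution techniques mentioned above can be sketched in a few lines. This is an illustrative example only, not code from the session: the function names, the two-decimal-place location precision, and the summary fields are my own choices, and it is shown in Python for brevity rather than as a shipping iOS implementation.

```python
from statistics import mean

def deresolve_location(lat, lon, places=2):
    """De-resolution: reduce GPS precision before retaining or sending it.
    Two decimal places is roughly 1 km, enough for a coarse region
    without preserving the user's exact route."""
    return (round(lat, places), round(lon, places))

def aggregate_heart_rate(samples_bpm):
    """Aggregation: keep only the summary a workout screen needs,
    not the full per-second series of readings."""
    return {
        "min": min(samples_bpm),
        "max": max(samples_bpm),
        "avg": round(mean(samples_bpm), 1),
    }

# The raw per-second samples stay on device; only the small
# aggregate and the coarse location would ever be retained.
samples = [92, 118, 131, 140, 137, 129]
summary = aggregate_heart_rate(samples)       # {'min': 92, 'max': 140, 'avg': 124.5}
coarse = deresolve_location(37.33182, -122.03118)  # (37.33, -122.03)
```

The design point is that minimization happens at the moment of collection: the detailed series is never written anywhere that a later breach could expose.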
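App Transport Security, mentioned above, is configured in an app's Info.plist. ATS is on by default for apps linked against the iOS 9 SDK, so no configuration is needed to get secure connections; the fragment below is a minimal sketch of the exception mechanism, where `example.com` is a placeholder for a legacy service that has not yet moved to HTTPS.

```xml
<key>NSAppTransportSecurity</key>
<dict>
    <!-- ATS is enabled by default; list only the exceptions you truly need. -->
    <key>NSExceptionDomains</key>
    <dict>
        <!-- Placeholder domain: a legacy service not yet served over HTTPS. -->
        <key>example.com</key>
        <dict>
            <key>NSIncludesSubdomains</key>
            <true/>
            <key>NSExceptionAllowsInsecureHTTPLoads</key>
            <true/>
        </dict>
    </dict>
</dict>
```

Scoping the exception to one named domain keeps every other connection the app makes under the default HTTPS-only policy.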