GDPR Compliance

The whole idea of GDPR is that people shouldn't get rich off the back of others by collecting and selling data they shouldn't.  Note Cambridge Analytica and the current fallout from that.  It also looks like there is real pressure for the US to accept GDPR as the default, so the scope widens.

What worries me is the complete disregard for the legal position Corona is putting its developers in.  That said, I'm sure they can simply disable their collection endpoint, as they would surely not want their customer base to suffer for this?

I am also sure that they don't want this publicly visible, which it currently is… oh look, here is Unity's public statement on GDPR: https://unity3d.com/legal/gdpr.

It is high time Corona accepted responsibility for the devs who have based their businesses on the platform.

So crack on ignoring your customer base, Corona…  My legal team informs me that I am well within my EU rights to sue Corona for any punitive action that is based on their non-compliance, plus a fair amount for damages and any disruption caused.  I am more than happy to coordinate a class action lawsuit for anyone affected if Corona does not act.

Yeah, I take this really seriously (it is my business after all) and I strongly suggest you do too.

I agree with most of what you're saying (except for the class action stuff against Corona; come on, let's be constructive).

But as you're bringing Unity up here, I just want to add that the situation with Unity is very similar. There's no clarity or information from Unity's side on what needs to be done as a developer using Unity Analytics (which you're forced to use if you're using IAPs in Unity), and Unity aren't responding to forum threads about it. (Same thing with Unity Ads.)

https://forum.unity.com/threads/unity-analytics-and-gdpr.513112/

OK, in an effort to be more constructive here is a helpful list of things that we as app developers need to consider for GDPR compliance… (This includes any third party that has access to any data including Corona’s automatic data harvesting).

By way of definition, personal data now includes IP addresses and all advertising identifiers - basically any data that is not anonymized.

1. Determine whether the app really needs all the requested personal data

The ideal privacy implementation saves as little personal data as possible, such as birth date, name, country of residence, etc. This is not possible in all cases; some entities will need more information. In all cases, though, developers and management should define exactly which data is absolutely necessary. 

2. Encrypt all personal data and inform users about it

If an application needs to save personal information, the data should be encrypted with proper and strong encryption algorithms, including hashing. In the Ashley Madison data breach, all personal data was in clear text, which had huge consequences for its users. It should be explicitly stated to users that all their personal data, including phone numbers, country of residence, and address, will be encrypted and hashed to avoid any form of data extraction and potential exposure in case of a data breach.
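To make point 2 concrete, here is a minimal sketch in Python (not Corona/Lua) of salted one-way hashing for a stored personal field, using only the standard library. Function names are illustrative, not from any real SDK; note that hashing only works for fields you merely need to compare later - data you must read back requires proper reversible encryption via a vetted library.

```python
import hashlib
import hmac
import os

def hash_personal_field(value, salt=None):
    """Derive a salted hash of a sensitive field so the raw value is
    never stored. Uses PBKDF2-HMAC-SHA256 with a random 16-byte salt."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", value.encode("utf-8"), salt, 200_000)
    return salt, digest

def verify_personal_field(value, salt, digest):
    """Check a candidate value against the stored salt + digest,
    using a constant-time comparison to avoid timing leaks."""
    _, candidate = hash_personal_field(value, salt)
    return hmac.compare_digest(candidate, digest)
```

With this, a breach of the datastore exposes only salts and digests, not the original phone numbers or addresses.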

3. Think OAUTH for data portability

Protocols for single sign-on such as OAUTH allow users to create accounts by simply providing another account, but they also assure that no personal data other than the authentication ID from the other service is stored.

4. Enforce secure communications through HTTPS

Many entities do not use HTTPS for their websites because they do not consider it necessary. For example, if the application does not require any form of authentication, then HTTPS might not seem needed. But it’s easy to overlook some things. For instance, some applications collect personal information through their “contact us” forms. If this information is sent in clear text, it will be exposed through the Internet. Also, you should make sure that the SSL certificate has been properly deployed and is not exposed to vulnerabilities related to SSL protocols.
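As a small illustration of point 4, this Python sketch rewrites plain `http://` links to `https://` (e.g. when generating the action URL for a "contact us" form). It is only the application-side half of the story; real enforcement also needs correct server configuration and ideally an HSTS header.

```python
from urllib.parse import urlsplit, urlunsplit

def force_https(url):
    """Rewrite an http:// URL to https://, leaving the rest of the
    URL (host, path, query) untouched. Other schemes pass through."""
    parts = urlsplit(url)
    if parts.scheme == "http":
        parts = parts._replace(scheme="https")
    return urlunsplit(parts)
```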

5. Inform users about and encrypt personal data from ‘contact us’ forms

Applications do not collect information only through authentication or subscription, but also through contact forms. Most of this information is personal, including email address, phone number, and country of residence. Users must be informed how this data will be stored and for how long. The use of strong encryption is highly recommended for storing this information.

6. Make sure sessions and cookies expire and are destroyed after logout

Users must have proper visibility about the use of cookies by the application. They must be informed that the application is using cookies, the application should provide the opportunity for users to accept or deny cookies, and cookies must be properly destroyed after inactivity or logout.
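A minimal sketch of point 6 using Python's standard `http.cookies` module: the session cookie carries an inactivity timeout plus `Secure`/`HttpOnly`/`SameSite` flags, and logout destroys it by expiring it immediately. The cookie name and timeout are illustrative.

```python
from http.cookies import SimpleCookie

def session_cookie(session_id, max_age_seconds=1800):
    """Build a Set-Cookie header value for a session that expires
    after 30 minutes of inactivity and is shielded from scripts
    (HttpOnly) and plain-HTTP transport (Secure)."""
    cookie = SimpleCookie()
    cookie["session"] = session_id
    cookie["session"]["max-age"] = max_age_seconds
    cookie["session"]["httponly"] = True
    cookie["session"]["secure"] = True
    cookie["session"]["samesite"] = "Lax"
    return cookie["session"].OutputString()

def logout_cookie():
    """Destroy the session cookie on logout: empty value, Max-Age=0."""
    cookie = SimpleCookie()
    cookie["session"] = ""
    cookie["session"]["max-age"] = 0
    return cookie["session"].OutputString()
```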

7. Do not track user activity for business intelligence 

Many e-commerce applications on the web track users to determine their tastes through their searches or products bought. Often, companies such as Amazon and Netflix use this sort of information for their recommender systems. Because users’ personal taste and choices are being monitored and stored for commercial purposes, the users should be able to accept or reject this as an option. If users decide to accept such tracking, they should then be told how the data is saved in the system and for how long. And, of course, anything related to personal information should be encrypted.

8. Tell users about logs that save location or IP addresses 

Many applications use IP addresses or locations as a parameter to control authentication and authorizations, and they log this information in case someone attempts to bypass authentication controls. Users should be told about this, as well as how long the logs will be saved in the system. Never include more sensitive information such as passwords in the logs.
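One way point 8's "never log passwords" rule could be enforced mechanically: sanitize every record before it reaches the log, dropping credential fields and truncating the IPv4 address to its /24 network. This is an illustrative sketch; the key set and truncation width are policy choices, not a standard.

```python
import ipaddress

# Fields that must never appear in a log line (illustrative list).
SENSITIVE_KEYS = {"password", "passwd", "secret", "token"}

def sanitize_log_record(record):
    """Drop credential fields and coarsen the IPv4 address to its
    /24 network before the record is written to the auth log."""
    clean = {k: v for k, v in record.items() if k.lower() not in SENSITIVE_KEYS}
    if "ip" in clean:
        net = ipaddress.ip_network(clean["ip"] + "/24", strict=False)
        clean["ip"] = str(net.network_address)
    return clean
```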

9. Store logs in a safe place, preferably encrypted

Keep any logs that contain user information in a safe place and inform users about what happens to these logs: how they are stored and how long are they retained. The logs themselves should be encrypted.

10. Security questions should not rely on users’ personal data

In many applications, security questions are used as a form to confirm the identity of a user. These questions should not include personal components such as mother’s maiden name or even the user’s favorite color. If possible, replace these questions with two-factor authentication. If that isn’t possible, let users create their own questions and warn them against creating questions that contain personal data. Any information provided should be encrypted.
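Point 10 suggests two-factor authentication as the replacement; a minimal TOTP sketch (RFC 6238: HMAC-SHA1, 30-second step, 6 digits) needs nothing beyond Python's standard library. This is a teaching sketch, not a production 2FA implementation.

```python
import hashlib
import hmac
import struct
import time

def hotp(key, counter, digits=6):
    """RFC 4226 HOTP: HMAC-SHA1 over the big-endian counter,
    dynamic truncation, then the last `digits` decimal digits."""
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key, timestamp=None, step=30, digits=6):
    """RFC 6238 TOTP: HOTP keyed on the current 30-second window."""
    if timestamp is None:
        timestamp = time.time()
    return hotp(key, int(timestamp // step), digits)
```

The shared secret lives on the server and in the user's authenticator app; no personal data is involved in the challenge at all.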

11. Create clear terms and conditions and make sure users read them

Don’t hide away your terms and conditions. Under the new EU privacy laws, terms and conditions should be on the landing page of any web application and be highly visible at all times while the user navigates the application. An enforcement mechanism is necessary so that users have to agree to the terms and conditions before being allowed access to the app, especially when the terms have changed. The terms and conditions should also be in language that is easily understood.

12. Inform users about any data sharing with third parties  

If your organization shares personal data with third parties, whether they are external plugins, affiliates, or government organizations, that fact should be included in the terms and conditions.

13. Create clear policies for data breaches

One of the most important aspects of the EU law is the right of users to be informed if a data breach occurs. Organizations must implement clear policies that establish roles and steps to follow so that, for example, users are promptly informed about any breach.

14. Delete data of users who cancel their service

Many web applications do not make it clear what happens with personal data after a user has canceled the service or deleted an account. With the right to be forgotten, companies should respect the right of users to delete all their account information and related data. It must be visible to users that they can leave a service and all their data will be deleted. Companies that treat deleted accounts as merely inactive could run afoul of the law.

15. Patch web vulnerabilities 

As mentioned on the OWASP Top 10 list, one of the major data privacy risks involves web application vulnerabilities: “Vulnerability is a key problem in any system that guards or operates on sensitive user data. Failure to suitably design and implement an application, detect a problem or promptly apply a fix (patch) is likely to result in a privacy breach.” Make sure your organization has a program in place to assess cyber risks and do penetration tests and patches effectively.

And here is an interesting read from Game Analytics on what personal data means:

According to GDPR, personal data is:

“Personal data is any information that relates to an identified or identifiable living individual. Different pieces of information, which collected together can lead to the identification of a particular person, also constitute personal data.

Personal data that has been de-identified, encrypted or pseudonymised but can be used to re-identify a person remains personal data and falls within the scope of the law.”

This means that not only is personally identifiable information like the user’s name, email address, or device ID (IDFA/GAID) personal data, but any data we can associate with one person, even if we cannot identify that person in the real world.

The most important consequence of this is that any data associated with one individual (or an ID referring to one individual, even if it is a randomly generated ID) is personal data – including actions they have taken in a game, such as starting the tutorial, picking a character, beginning or ending a session.

 

  • For players the game developers must ask for consent when the game opens, before any data has been sent to us (or to other data controllers and processors). The consent they ask for from their players must include that their data will be used for analytics and marketing purposes. Most game developers should also have publicly available privacy policies and terms of service that can be reviewed by users.

I doubt many will consent to their data being used for “analytics and marketing purposes”.  So it is game over for that service then.

We’re working within our team to identify the data we’re collecting, and we will publish a statement when we are ready. We will make sure that apps are compliant with GDPR from our side.

We are aware of the issue and are working both on becoming GDPR compliant ourselves (Simulator & Native) and on making sure Corona is not in the way of making your apps compliant.
 
So, yes, [member=‘SGS’], for now we can release the same statement as Unity did: we’re looking into it, assessing the data we’re collecting, and trying to decide if we’re compliant in handling it.

I just checked what data simulator sends (latest daily build) with proxy:

On Windows it sends requests only to the build server or on authorization. On macOS it additionally sends some basic analytics events about user activity. I don’t see any of the servers you posted in your screenshot, SGS. May I ask you to try the latest daily build and tell me if you still see the same thing?

@sgs: Most analytics providers fall into that category if they are free. Flurry, for example, has the same type of consent request. Flurry, if you have the newer version of the SDK, also has a built-in UI to ask for consent for Flurry only. I can’t imagine having my app open up four consent requests for each API before starting.

In my opinion, many small app developers will simply ignore the requirements of GDPR and hope they don’t get caught. Big developers will have the money and time to fully meet the regulation.

For my part, I am going to thin out the SDKs that I use to a minimum. As for consent, I’ll ask the user at the point I need it, either for leaderboards or for pulling up preferences. I’ll store it locally, check it at launch, and if it has expired I’ll ask for it again.
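One way that store-locally-and-expire consent scheme could look, sketched in Python (a Corona app would do the equivalent in Lua with its local storage). The one-year TTL is an illustrative policy choice, not a GDPR requirement.

```python
import json
import time

CONSENT_TTL = 365 * 24 * 3600  # re-ask after a year (illustrative policy)

def record_consent(granted, now=None):
    """Serialize a consent decision with a timestamp for local storage."""
    return json.dumps({"granted": granted,
                       "at": now if now is not None else time.time()})

def consent_is_valid(stored, now=None):
    """True only if consent was granted and has not expired; a missing
    or expired record means the app must ask the user again."""
    if not stored:
        return False
    rec = json.loads(stored)
    now = now if now is not None else time.time()
    return bool(rec.get("granted")) and (now - rec["at"]) < CONSENT_TTL
```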

OK so here is 3268…

li903.30.members.linode.com maps to coronalabs.com - IP is 45.56.101.30

s20522443.onlinehome-server.info maps to one of my servers

lhr35s02-in-fe4.1e100.net is a google domain - not sure on this one?

edge-star-shv-01-lht6.facebook.com maps to Facebook Ireland - again, unsure why the simulator would be connecting to Facebook?

Summary

My main concern is that Corona is my only weak point - every other plugin I use is GDPR compliant.

According to your own privacy policy you are collecting IP addresses (amongst other data) and that is private data as far as GDPR is concerned.

The simple solution for us all is Corona stops background harvesting of data.

@agramonte Google Analytics doesn’t, so I recommend using that instead.  I cannot comment on the plugin, as I go direct to Google Analytics and therefore control what is sent.

Looks like Unity are tracking quite a lot of stuff too - https://twitter.com/glassbottommeg/status/986635257242796032

@SGS, if Google Analytics store the REST request IP addresses, don’t you still need a way to address that in your privacy policy to avoid violating GDPR?

Like this https://developers.google.com/analytics/devguides/collection/protocol/v1/parameters#aip
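For anyone following along, the `aip` parameter from that docs page plugs into a Measurement Protocol hit like this (Python sketch; the tracking and client IDs are placeholders). With `aip=1`, Google zeroes the last octet of the sender's IP before storing it.

```python
from urllib.parse import urlencode

def ga_event_payload(tracking_id, client_id, category, action):
    """Build a Google Analytics Measurement Protocol (v1) event hit
    with IP anonymization enabled via aip=1."""
    params = {
        "v": "1",            # protocol version
        "tid": tracking_id,  # UA-XXXXX-Y property ID (placeholder)
        "cid": client_id,    # anonymous client ID
        "t": "event",        # hit type
        "ec": category,      # event category
        "ea": action,        # event action
        "aip": "1",          # anonymize the sender's IP address
    }
    return urlencode(params)
```

The payload would then be POSTed to the GA collection endpoint; only the query-string construction is shown here.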

I didn’t know about that. Thank you :)

I only send session start and session end events.  I used to track more granular data but quickly ran into the free limits of Analytics. 

They say 10 million per month, but it only really started complaining at around 60 million per month.

Anonymizing IPs is necessary, but not enough. You can’t have a unique ID per user, not even a random number, because that now qualifies as PII. 

We are considering sharing a smaller number of random IDs (‘buckets’) among a large group of users in order to learn usage trends while not being able to pinpoint any one user’s behavior. We will be missing out on some measurements e.g. anything involving uniques, while still being able to see overall figures.

Anyone have comments on this approach? How big should a bucket be to consider the data non PII?
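For what it's worth, the bucketing idea above could be sketched like this in Python: the raw device ID never leaves the app, only a deterministic bucket number shared with many other users. Whether any fixed bucket size makes the data legally non-PII is exactly the open question (it is essentially a k-anonymity argument: larger buckets give stronger anonymity but coarser trends), so treat this as a sketch, not legal advice.

```python
import hashlib

def bucket_for(user_id, num_buckets=1000):
    """Deterministically map a user/device ID to one of `num_buckets`
    shared buckets. Only the bucket number is sent to analytics, so
    events cannot be pinned to a single individual."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return int.from_bytes(digest[:8], "big") % num_buckets
```

Note the mapping is stable (same ID always lands in the same bucket), which preserves trend data but also means a very small bucket count per active user base weakens the anonymity claim.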

@studycat what is the source for “You can’t have a unique ID per user, not even a random number, because that now qualifies as PII”?

Because just about everything will require some form of unique identifier?  Certainly, any game with an online element will need one.

I don’t track user activity - only on and off - so I hope I am OK.

Any ID that you can track, and then go back and see what that person did, is personal data and requires explicit consent from that user. That includes IP, both the ad ID and vendor ID from Apple, as well as any generated ID you create. This is from the regulation document. Name, ID number, location data or…

Rec.26; Art.4(1)

“Personal data” means any information relating to an identified or identifiable natural person (“data subject”); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that person.

I think maybe the hype (and threat of massive fines) somewhat clouds the judgement.  We are not Facebook and we do not sell our players data.  As long as we stick to the general principles I think we are fine.

I imagine they will take a top down approach and go for the bigger fish first.  We are minnows and we can feel safe in our little rock pool.

Let’s see how the big games handle things… If they generally pop a message about opting in, then we should follow.

I agree that we won’t be sued first and that we have some time to see what others do. I am fairly sure that AdColony will be one of the first to be sued, since they claim that their data collection is essential. From their FAQ:

AdColony will not request nor require consent from a user in order to display advertisements. We believe that our legitimate interest is appropriate given the value we bring to sustaining a healthy ecosystem amongst users, advertisers, and publishers after having conducted a legitimate interest assessment.