According to the privacy policy, a chunk of data is being collected and said to be provided to the developers (incidentally, this makes them a “processor” under GDPR), but we don’t know where this data goes or how it’s used. This may cause real problems in the near future when GDPR kicks in.
Hey all, I’m trying to get an answer on this. Please be patient.
Rob
I don’t have an answer yet. I’m pressing the team to find out, but they are incredibly busy right now.
Rob
A €20 million fine would probably sharpen their focus. There will be people out there looking to test and profit from the system from day one; this needs to be taken seriously and be ready to go on May 25th.
I hope they are busy with GDPR compliance or as nick_sherman says, things can go sideways.
Following this. We can remove analytics and ad network plugins if needed to be GDPR compliant, but if Corona is doing something natively we need to know what it is and how to turn it off. I don’t want to have to put a disclaimer at the launch of every app saying we collect information.
Yes, especially since users must be able to opt out.
@Rob we need the official stance on this sooner rather than later. Time is marching on. This thread was opened March 21st; if an answer takes much longer we will be at the deadline, and that leaves zero time for us to plan a strategy, code it, test it and roll it out.
Therefore, either give us full access to the information you collect or stop collecting it immediately.
I am sure you are only too aware of the recent backlash over hidden data collection based on poor practices by Facebook.
This is a legal requirement for any apps selling into the EU and not something up for debate.
I know it’s a legal issue but the answer about those statistics should be simple. If you say “we’re providing our developers with usage statistics”, we should be able to see it somewhere.
I can’t help but think that the privacy policy is dated back to Corona Analytics days and the person responsible for that just forgot to remove references to it.
The fact this thread has been going for nearly a month with no new info is disconcerting. From how I understand GDPR, if Corona is sending user info to a remote server and developers don’t have the ability to stop it, then every single Corona app ever made is in violation of GDPR. Basically no Corona apps could be sold in Europe without breaking the law.
Corona needs to provide developers a way to simply stop this collection. We have 20+ apps and if we have to rebuild and resubmit every app before the deadline then Corona needs to address this ASAP. Ideally we will be able to stop it remotely without the need for a rebuild, but that may be hoping for too much.
If Corona does not fix this in time, I think the Google Play console (and maybe Apple’s) will have an option asking whether your app is GDPR compliant. We may have to withhold our apps from Europe until we are compliant.
The whole idea of GDPR is that people shouldn’t get rich off the back of others by selling/collecting data they shouldn’t. See Cambridge Analytica and the current fallout from that. It looks like there is also real pressure for the US to accept GDPR as the default, so the scope is widening.
What worries me is the complete disregard for the legal position Corona is putting its developers in. That said, I’m sure they can simply disable their collection endpoint easily, as they would surely not want their customer base to suffer for this?
I am also sure that they don’t want this publicly visible, which it currently is… oh look, here is Unity’s public statement on GDPR: https://unity3d.com/legal/gdpr.
It is high time Corona accepted responsibility for the devs that have based their businesses on the platform.
So carry on ignoring your customer base, Corona… My legal team informs me that I am well within my EU rights to sue Corona for any punitive action that is based on their non-compliance, plus a fair amount for damages and any disruption caused. I am more than happy to coordinate a class action lawsuit for anyone affected if Corona does not act.
Yeah, I take this really seriously (it is my business after all) and I strongly suggest you do too.
I agree with most of what you’re saying (except for the class action stuff against Corona; come on, let’s be constructive).
But since you bring Unity up here, I just want to add that the situation with Unity is very similar. There’s no clarity or information from Unity’s side on what needs to be done as a developer using Unity Analytics (which you’re forced to use if you’re using IAPs in Unity), and Unity aren’t responding to forum threads about it. (Same thing with Unity Ads.)
https://forum.unity.com/threads/unity-analytics-and-gdpr.513112/
OK, in an effort to be more constructive here is a helpful list of things that we as app developers need to consider for GDPR compliance… (This includes any third party that has access to any data including Corona’s automatic data harvesting).
For definition, personal data now includes IP addresses and all advertising identifiers - basically any data that is not anonymized.
1. Determine whether the app really needs all the requested personal data
The ideal privacy implementation saves as little personal data as possible, such as birth date, name, country of residence, etc. This is not possible in all cases; some entities will need more information. In all cases, though, developers and management should define exactly which data is absolutely necessary.
2. Encrypt all personal data and inform users about it
If an application needs to save personal information, the data should be protected with proper, strong cryptographic algorithms: encryption where the value must be recoverable, and salted hashing where it does not. In the Ashley Madison data breach, all personal data was in clear text, which had huge consequences for its users. It should be explicitly stated to users that all their personal data, including phone numbers, country of residence, and address, will be encrypted or hashed to limit data extraction and potential exposure in case of a data breach.
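A minimal Python sketch of the salted-hashing half of this point (the field value and iteration count are illustrative; encrypting recoverable fields needs a vetted crypto library, which the standard library does not provide):

```python
import hashlib
import hmac
import os

ITERATIONS = 200_000  # illustrative work factor; tune for your hardware


def hash_pii(value, salt=None):
    """Salted, slow hash for PII that never needs to be read back."""
    salt = salt if salt is not None else os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", value.encode(), salt, ITERATIONS)
    return salt, digest


def matches_pii(value, salt, digest):
    """Constant-time comparison against a stored (salt, digest) pair."""
    candidate = hashlib.pbkdf2_hmac("sha256", value.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```

Storing only the salt and digest lets you match a value later (for deduplication or lookups) without ever being able to recover the original in a breach.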
3. Think OAUTH for data portability
Protocols for single sign-on such as OAUTH allow users to create accounts by simply providing another account, but they also assure that no personal data other than the authentication ID from the other service is stored.
4. Enforce secure communications through HTTPS
Many entities do not use HTTPS for their websites because they do not consider it necessary. For example, if the application does not require any form of authentication, then HTTPS might not seem needed. But it’s easy to overlook some things. For instance, some applications collect personal information through their “contact us” forms. If this information is sent in clear text, it will be exposed over the Internet. Also, make sure that the SSL certificate has been properly deployed and is not vulnerable to known SSL/TLS protocol weaknesses.
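A trivial guard along these lines: refuse to post form data to anything but an HTTPS endpoint. A hedged Python sketch (the function name is ours, not from any library):

```python
from urllib.parse import urlparse


def require_https(url):
    """Raise before personal data is sent to a non-HTTPS endpoint."""
    scheme = urlparse(url).scheme
    if scheme != "https":
        raise ValueError(
            "refusing to send personal data over %r URL: %s" % (scheme, url)
        )
    return url
```

Calling this once in the form-submission path turns an accidental `http://` endpoint into a loud failure instead of a silent clear-text leak.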
5. Inform users about and encrypt personal data from ‘contact us’ forms
Applications do not collect information only through authentication or subscription, but also through contact forms. Most of this information is personal, including email address, phone number, and country of residence. Users must be informed how this data will be stored and for how long. The use of strong encryption is highly recommended for storing this information.
6. Make sure sessions and cookies expire and are destroyed after logout
Users must have proper visibility about the use of cookies by the application. They must be informed that the application is using cookies, the application should provide the opportunity for users to accept or deny cookies, and cookies must be properly destroyed after inactivity or logout.
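The expire-and-destroy logic can be sketched as follows (the 30-minute timeout and class name are assumptions; a real web framework’s session store would replace this):

```python
import time

SESSION_TTL = 30 * 60  # assumed inactivity timeout: 30 minutes


class SessionStore:
    """Toy server-side session store that destroys, never just flags."""

    def __init__(self):
        self._last_seen = {}

    def touch(self, sid, now=None):
        """Record activity for a session."""
        self._last_seen[sid] = now if now is not None else time.time()

    def is_valid(self, sid, now=None):
        """Valid only if seen within the TTL; expired sessions are removed."""
        now = now if now is not None else time.time()
        last = self._last_seen.get(sid)
        if last is None or now - last > SESSION_TTL:
            self._last_seen.pop(sid, None)  # destroy expired state server-side
            return False
        return True

    def logout(self, sid):
        """Destroy the session outright on logout."""
        self._last_seen.pop(sid, None)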
7. Do not track user activity for business intelligence
Many e-commerce applications on the web track users to determine their tastes through their searches or products bought. Often, companies such as Amazon and Netflix use this sort of information for their recommender systems. Because users’ personal taste and choices are being monitored and stored for commercial purposes, the users should be able to accept or reject this as an option. If users decide to accept such tracking, they should then be told how the data is saved in the system and for how long. And, of course, anything related to personal information should be encrypted.
8. Tell users about logs that save location or IP addresses
Many applications use IP addresses or locations as a parameter to control authentication and authorizations, and they log this information in case someone attempts to bypass authentication controls. Users should be told about this, as well as how long the logs will be saved in the system. Never include more sensitive information such as passwords in the logs.
9. Store logs in a safe place, preferably encrypted
Keep any logs that contain user information in a safe place and inform users about what happens to these logs: how they are stored and how long they are retained. The logs themselves should be encrypted.
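A sketch covering both logging points above: redact fields that must never reach the logs and compute an explicit retention deadline (the field names and the 30-day period are assumptions, not requirements from the regulation):

```python
import re
from datetime import datetime, timedelta

LOG_RETENTION = timedelta(days=30)  # assumed retention period, disclosed to users


def scrub(line):
    """Redact values that must never be logged (passwords, tokens)."""
    return re.sub(r"(password|token)=\S+", r"\1=[REDACTED]", line,
                  flags=re.IGNORECASE)


def expiry(created):
    """When a log entry created at `created` must be deleted."""
    return created + LOG_RETENTION
```

Running every line through `scrub` before it hits the log file, and pruning anything past `expiry`, gives you something concrete to state in the privacy policy.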
10. Security questions should not rely on users’ personal data
In many applications, security questions are used as a form to confirm the identity of a user. These questions should not include personal components such as mother’s maiden name or even the user’s favorite color. If possible, replace these questions with two-factor authentication. If that isn’t possible, let users create their own questions and warn them against creating questions that contain personal data. Any information provided should be encrypted.
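Replacing security questions with two-factor authentication can be as small as an RFC 6238 time-based one-time password check; a self-contained Python sketch:

```python
import base64
import hmac
import struct


def totp(secret_b32, for_time, digits=6, period=30):
    """RFC 6238 TOTP from a base32 shared secret at a given Unix time."""
    key = base64.b32decode(secret_b32)
    counter = int(for_time // period)          # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), "sha1").digest()
    offset = digest[-1] & 0x0F                 # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

The server stores only the shared secret (no mother’s maiden name anywhere) and compares the user’s submitted code against `totp(secret, time.time())`, usually allowing one step of clock drift either way.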
11. Create clear terms and conditions and make sure users read them
Don’t hide away your terms and conditions. Under the new EU privacy laws, terms and conditions should be on the landing page of any web application and be highly visible at all times while the user navigates the application. An enforcement mechanism is necessary so that users have to agree to terms and conditions before being allowed access to the app, especially when terms have been changed. The terms and conditions should also be in language that is easily understood.
12. Inform users about any data sharing with third parties
If your organization shares personal data with third parties, whether they are external plugins, affiliates, or government organizations, that fact should be included in the terms and conditions.
13. Create clear policies for data breaches
One of the most important aspects of the EU law is the right of users to be informed if a data breach occurs. Organizations must implement clear policies that establish roles and steps to follow so that, for example, users are promptly informed about any breach.
14. Delete data of users who cancel their service
Many web applications do not make it clear what happens with personal data after a user has canceled the service or deleted an account. With the right to be forgotten, companies should respect the right of users to delete all their account information and related data. It must be visible to users that they can leave a service and all their data will be deleted. Companies that treat deleted accounts as merely inactive could run afoul of the law.
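The right to be forgotten means a hard delete across every table keyed to the user, not a soft “inactive” flag. A toy sketch over an in-memory store (the table names are illustrative):

```python
def forget_user(db, user_id):
    """Hard-delete the account and every row keyed to it."""
    db.get("accounts", {}).pop(user_id, None)
    # Also purge related data, not just the account record itself.
    for table in ("sessions", "analytics", "purchases"):
        rows = db.get(table, {})
        stale = [k for k, row in rows.items() if row.get("user_id") == user_id]
        for key in stale:
            del rows[key]
```

In a real system the same sweep also has to reach backups and any processors the data was shared with, which is exactly why knowing where Corona sends data matters.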
15. Patch web vulnerabilities
As mentioned on the OWASP Top 10 list, one of the major data privacy risks involves web application vulnerabilities: “Vulnerability is a key problem in any system that guards or operates on sensitive user data. Failure to suitably design and implement an application, detect a problem or promptly apply a fix (patch) is likely to result in a privacy breach.” Make sure your organization has a program in place to assess cyber risks and do penetration tests and patches effectively.
And here is an interesting read from Game Analytics on what personal data means:
According to GDPR, personal data is:
“Personal data is any information that relates to an identified or identifiable living individual. Different pieces of information, which collected together can lead to the identification of a particular person, also constitute personal data.
Personal data that has been de-identified, encrypted or pseudonymised but can be used to re-identify a person remains personal data and falls within the scope of the law.”
This means that not only is personally identifiable information like the user’s name, email address, or device ID (IDFA/GAID) personal data, but any data we can associate with one person, even if we cannot identify that person in the real world.
The most important consequence of this is that any data associated with one individual (or an ID referring to one individual, even if it is a randomly generated ID) is personal data – including actions they have taken in a game, such as starting the tutorial, picking a character, beginning or ending a session.
- Game developers must ask players for consent when the game opens, before any data has been sent to us (or to other data controllers and processors). The consent they ask for from their players must include that their data will be used for analytics and marketing purposes. Most game developers should also have publicly available privacy policies and terms of service that can be reviewed by users.
I doubt many will consent to their data being used for “analytics and marketing purposes”. So it is game over for that service then.
We’re working within our team to identify the data we’re collecting, and we will publish a statement when we are ready. We will make sure that apps are compliant with GDPR from our side.
We are aware of the issue and working both on becoming GDPR compliant ourselves (Simulator & Native) and making sure Corona is not in a way of making your apps compliant.
So, yes, [member=‘SGS’], for now we can release the same statement as Unity did: we’re looking into it, assessing the data we collect and trying to decide if we’re compliant in handling it.
I just checked what data simulator sends (latest daily build) with proxy:
On Windows it sends requests only to the build server or on authorization. On macOS it additionally sends some basic analytics events about user activity. I don’t see any of the servers you posted in your screenshot, SGS. May I ask you to try the latest daily build and tell me if you still see the same thing?
@sgs: Most analytics providers fall into this category if they are free. Flurry, for example, has the same type of consent request; if you have the newer version of the SDK, Flurry also has a built-in UI to ask for consent for Flurry only. I can’t imagine having my app open up 4 consent requests for each API before starting.
In my opinion, many small app developers will simply ignore the requirements of GDPR and hope they don’t get caught. Big developers will have the money and time to fully meet the regulation.
For my part, I am going to thin out the SDKs that I use to a minimum. As for consent, I’ll ask the user at the point that I need it, either for leaderboards or for pulling up preferences. I’ll store it locally and check at launch, and if it has expired I’ll ask for it again.
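Corona apps are written in Lua, but the store-locally-and-expire logic described above can be sketched in Python (the file format and the one-year TTL are assumptions):

```python
import json
import time

CONSENT_TTL = 365 * 24 * 3600  # assumed: re-ask for consent after one year


def save_consent(path, granted, now=None):
    """Record the user's choice and when it was made."""
    record = {"granted": granted,
              "at": now if now is not None else time.time()}
    with open(path, "w") as f:
        json.dump(record, f)


def consent_needed(path, now=None):
    """True if we must (re-)ask: no record yet, or the record has expired."""
    now = now if now is not None else time.time()
    try:
        with open(path) as f:
            record = json.load(f)
    except (OSError, ValueError):
        return True  # missing or corrupt file: treat as never asked
    return now - record["at"] > CONSENT_TTL
```

At launch the app calls `consent_needed` and only shows the consent dialog when it returns `True`, which keeps the prompt from reappearing on every run.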