Caring for our personal data requires as much effort as managing our finances, maybe even more. A joint investigation conducted by Maldita.es, Tactical Tech and SocialTIC analysed how financial technology (fintech) apps focused on personal budgeting handle their users’ data privacy, but also how users themselves manage their personal information inside these apps.
Research and reporting by Maldita.es, Tactical Tech and SocialTIC
- We used the EU's GDPR (General Data Protection Regulation) mechanisms to ask five fintech companies for the information they store about some of their users.
- We worked with volunteer "data donors" to collect such information.
- Data we obtained showed details such as money spent on drugs, alcohol, travel or birth control pills, all stored in their applications.
- This information feeds algorithms that build users' credit or personal data profiles, is shared for advertising purposes, and is sent to countries not covered by the GDPR, such as the United States.
- We discussed with experts the challenges and risks that companies' access to such data may pose for users in the short and long term.
Our goal was to understand how privacy obligations are implemented and whether personal data that European users input in such apps - because they are requested to upon sign up, or because they want to - is shared beyond the app, beyond Europe, and to what extent.
What we found was quite intriguing:
- a company handed us data about the wrong user;
- an app refused initially to send us data as required by law;
- evidence of aggressive trackers accessing users’ personal data, and the use of automated user profiling mechanisms.
Personal budgeting opens a new scenario in which people do not only manage their finances but are also willing to disclose their habits, cravings, guilty pleasures and lifestyles as if they were using a social media platform, while not being entirely aware of the consequences. Users may openly share with digital services such intimate details as the money they spend on alcohol or drugs, or whether they take medication such as contraceptive pills.
These personal behaviours reflected in individual spending are then shared with third-party platforms and companies - often tech giants like Facebook, Google and Amazon, but also with many hungry trackers meant to digest data, build user profiles and serve tasty advertising tailored to preferences we did not even know we had. Yes, that means our mother does not know how much money we are spending on parties, but a random advertiser from a different country or a social media platform might, and can use that information to build an accurate profile of us (if you read Spanish, also check this article from Maldita.es: “Por qué la publicidad online sabe lo que te quieres comprar” / “Why online advertising knows what you want to buy”).
From the personal finance apps available to users in the European Union, we took a closer look at five services: four budget control apps - Fintonic, You Need a Budget (YNAB), Splitwise and Tricount - and Revolut, a mobile banking application that has also been used as an expense management app in countries like Romania. The selection of these applications was not arbitrary. They were among the most downloaded personal budgeting and finance control apps as of the end of 2021 across Spain, France, Germany or Romania – countries we looked into when carrying out the research. They all offer personal finance and budgeting services with different functionalities, from sharing expenses with colleagues, friends or strangers to setting up a detailed budget to save money for a trip, a master’s degree or even retirement.
There are no clear guarantees that the financial and other personal data that apps and the companies behind them collect from users is being treated with sufficient care. Throughout our research into the privacy policies and personal data handling of budgeting apps available in Europe, we observed that companies are not fully transparent about the way they process users' data. Moreover, they often do not comply with their own rules and sometimes keep personal information longer than required. For instance, we encountered a case in which personal data from a random user was disclosed to us without that user's knowledge, and had been kept beyond the stated retention period even though they had stopped using the app years ago.
Through data subject access requests (DSARs) filed under the right of access provided by the EU General Data Protection Regulation (GDPR), a technical analysis of the selected apps and a careful look at their privacy policies, we identified issues that shed light on the extent to which citizens may be vulnerable in contexts where they expect to have their sensitive data and their digital rights attended to and respected.
Under this right of access, 15 people with different socio-demographic characteristics who used at least one of the five apps authorised us to request a copy of all the personal data collected and stored about them by the companies operating these apps.
This report summarises our main findings as well as insights offered by experts in the field of data protection.
In a nutshell
The apps we analysed - and many more like them - are part of a thriving fintech (short for 'financial technology') industry. This industry is animated by start-ups that have merged the fields of finances and digital technology to ease and expedite the way institutions and people deal with all kinds of financial transactions. On one hand, this relatively new field has allowed people - the 'end users' - to handle their own finances; on the other hand, it has revolutionised traditional corporate finances and money flows.
There is a widespread interest among investors to support fintech and its start-ups, in a race to diversify financial services, cover more countries and recruit more users. For instance, banking app Revolut and the group of companies behind it raised 800 million USD in 2021, expense sharing app Splitwise raised 20 million USD in investments in 2021, while banking and budget management app Fintonic received a 21.4 million USD investment in 2019.
| Company | Splitwise Inc. | You Need A Budget LLC | Tricount | Fintonic Servicios Financieros SL | Revolut Ltd. |
|---|---|---|---|---|---|
| Registered in | Delaware, United States (based in Rhode Island, U.S.) | Utah, United States | Brussels, Belgium | Madrid, Spain | London, United Kingdom |
| Installs on Google Play* | 10,000,000+ | 1,000,000+ | 5,000,000+ | 1,000,000+ | 10,000,000+ |
| Further details | Splitwise profile on Crunchbase | YNAB profile on Crunchbase | Tricount profile on Crunchbase | Fintonic profile on Crunchbase | Revolut profile on Crunchbase |
Summary table: basic data on the analysed apps. Numbers of Google Play installs at global level (where the apps are available) according to the apps' profiles on Google Play, as verified on 4 March 2022. For more detailed install data from App Store and other sources, check updated statistics from analytics services like Sensor Tower, App Figures, etc.
In terms of access to real-time financial data, Fintonic and YNAB give users the ability to link their bank accounts with the app service so that income and expenses are automatically linked to a budget plan.
UK-based start-up Revolut is itself a digital banking app, so its budget control services have constant access to a user's bank details, and they connect users to a range of other Revolut companies operating insurance, marketing, travel planning, investment services (cryptocurrency, stock trading, etc.) and more. In addition, because of its due diligence requirements, Revolut collects a large amount of personal information and records - about the user and their spouse and family - from third parties such as credit-reference agencies, financial or credit institutions, official registers and fraud-prevention agencies. It also informs users that it may, in return, give their data to social media and marketing firms for advertising purposes, to credit-ranking agencies to assess whether a user qualifies for credit, or to law enforcement to check whether the user may be suspected of fraud. Therefore, users planning to open an account for basic personal budget planning alone may want to weigh these conditions against their needs.
Machines looking into your data
Fintonic is considered the leading Spanish-language app for money management, and it has also operated as a digital bank since 2019, when it received its licence from the Bank of Spain. Since its launch in 2012, it has added multiple functionalities, such as letting users take out insurance or apply for a loan with another company. It also provides interested users with its own spending card. These procedures can be completed directly from the mobile application, which acts as a money organizer.
Image: Fintonic app screenshot. Source: Maldita.es. (Spanish to English translation: “FinScore: loans and cards. 830 - the score of your financial capacity.”)
The service permits users to transfer all financial records to the app and set up an "intelligent" budget plan to meet their goals. Expenses can be categorized in a variety of different areas: leisure, transport, housing, energy, banking, trips, beauty, hotels, investment, health, etc. This means that the app has access to all the financial transactions and payment receivers, from the therapist we visit to the nursery school our children go to.
With this data, it calculates if users are managing their expenses appropriately through a machine learning algorithm called FinScore. Fintonic promotes FinScore as an "impartial" and "independent" index that will grade how 'well' we are doing with our money. When contacted by email and asked specifically about how this automated scoring system worked, Fintonic told us that decisions are a "result of a process of analysis of the information that Fintonic knows about the users based on their personal profile and their transaction history" (source: email exchange between Fintonic representative and research team).
The company has not provided further information about this. On their web page, they mention "160 variables" that the algorithm takes into account, including income level and frequency, net balance, returned receipts or credit history (if a person has ever asked for a loan, if they paid it back, etc.). If users want to engage with a bank that has partnered with Fintonic, their score will be shared with the financial entity, who will decide if a loan or another credit product is approved.
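As a purely hypothetical illustration of how a variable-weighted score of this kind might work - Fintonic has not disclosed its 160 variables or their weights, so every variable and weight below is invented - the calculation could look like this:

```python
# Purely hypothetical sketch of a weighted, credit-style score.
# Fintonic has not disclosed FinScore's actual variables or weights;
# the variables, weights and scale below are invented for illustration.
WEIGHTS = {
    "monthly_income": 0.4,
    "net_balance": 0.3,
    "returned_receipts": -0.2,   # returned receipts lower the score
    "loans_repaid_on_time": 0.1,
}

def score(user: dict) -> float:
    # Weighted sum of normalised variables (each between 0 and 1),
    # scaled to a 0-1000 range and clamped at the boundaries.
    raw = sum(WEIGHTS[k] * user.get(k, 0.0) for k in WEIGHTS)
    return max(0.0, min(1000.0, raw * 1000))

print(score({"monthly_income": 0.9, "net_balance": 0.7,
             "returned_receipts": 0.1, "loans_repaid_on_time": 1.0}))  # ~650
```

The point of the sketch is that a user never sees which variables matter or how heavily each one weighs, yet the resulting number can be shared with a bank that decides whether to grant a loan.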
The financial score is supposed to refresh at the beginning of each month. However, some users have claimed that the automatic scoring remained the same even though the balance in their account changed, with a larger income and fewer outlays: "I have been using the service for several months and the FinScore score has not changed, even though it should have (...) since my expenses have varied (...) and I hardly spent money during the past months."
According to Fintonic, from our email communication with the company, the scoring that their algorithm generates about users is only transferred to other companies with the user's consent, meaning that supposedly they ask users whenever it gets shared with third parties.
Had a slip last night? Tell your app about your contraception pill
Knowing if a digital service uses algorithmic systems for any purpose is important because they don’t ‘see’ what a person can see.
Let us look into this with an example. When analysing the data that the companies sent to us, we encountered a very specific case that set off our alarms. Among one user's expenses list, we found a note with the name “Pranzo.” A person looking at that specific set of expenses would quickly notice that the user was probably talking about “lunch,” the English translation of the Italian word “pranzo.” The problem is that in Spain, “Pranzo” is known as the name of a medication used for eating disorders, such as anorexia. We could infer that the user was on a trip to Italy by analysing the rest of their expenses. But if the word is isolated from its context, knowing that the user is Spanish and not Italian, and depending on the type of algorithm used, a machine could interpret that the user spent money on this medication.
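To make the ambiguity concrete, here is a hypothetical sketch of a context-blind, keyword-based expense categoriser. The keyword list and the logic are invented for illustration; none of the analysed companies has published its classification code.

```python
# Hypothetical sketch: a naive, context-blind expense categoriser.
# Neither the keyword list nor the logic comes from any of the
# analysed apps; both are invented for illustration only.
MEDICATION_KEYWORDS = {"pranzo"}  # "Pranzo" is a medication name in Spain

def categorise(note: str) -> str:
    # The machine only sees tokens, not the trip-to-Italy context
    # that tells a human reader "pranzo" simply means "lunch".
    for token in note.lower().split():
        if token in MEDICATION_KEYWORDS:
            return "health/medication"
    return "uncategorised"

print(categorise("Pranzo"))               # flagged as a medication expense
print(categorise("Pranzo at trattoria"))  # still flagged: context is ignored
```

A human sees a lunch receipt from a holiday in Italy; a keyword matcher sees a health expense and quietly attaches it to the user's profile.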
Image: sample from user dataset obtained by Maldita.es and Tactical Tech through DSAR (data subject access request).
It is also important to stress that we are turning these apps into little diaries, which we assume only our partner or our friends can read. This reaches a point where we feel sufficiently confident to write down the day we spent money on a contraceptive pill, without knowing who has access to such information and what is done with that data (considered sensitive under the GDPR since it relates to health).
Image: sample from user dataset obtained by Maldita.es and Tactical Tech through DSAR. Translation of the caption in the image: “Pill because X does not want siblings”.
Sometimes, we describe expenses too plainly or we write them down as an inside joke, knowing that in an app like Splitwise, where we share outlays with friends, they will understand them and maybe laugh. Still, we need to bear in mind that if we tell Splitwise (or any of the other apps mentioned) that we are spending 50 EUR on “mushrooms,” on “cocaine” or on “marijuana,” this is going to stay in the app (these are real examples from the answers to our DSARs). Registered, stored and maybe processed.
The US Government is closer to your data than your mum is
Sometimes these personal details travel from Europe-based app users all the way to the United States. Under the General Data Protection Regulation (GDPR), sending app users' data to the US or other countries outside Europe is restricted unless the destination ensures an adequate level of protection or appropriate safeguards are in place, because companies must guarantee that our data remains protected to the standard of European data protection law.
You Need a Budget (YNAB) is an expense control app that allows users to classify their spendings in different categories and build a budget around them. Linking a bank account and respective transactions is an option for users that prefer an automatic sync rather than adding each transaction manually. It has very specific functions like adding deadlines to monthly bills, for example to pay for energy or internet at home before the limit date, and it even gives the option to link a credit card and monitor its operations.
YNAB is available for users in European countries including Spain, France, UK, Germany, Romania or Lithuania, but it is a US-based company (State of Utah). Let us start here with why this is important.
Why is it relevant to point out when personal data is stored and processed in the United States and not under European jurisdiction?
“There is a big issue with transfers to the US, which is the Foreign Intelligence Surveillance Act (FISA), which has been held time and time again to constrain data protection rights of EU residents. FISA allows for a disproportionate access to personal data via surveillance programs and does not allow for effective judicial redress on the side of individuals after the fact”, data protection lawyer Rahul Uttamchandani explains to us.
This means that there are more possibilities that our data is accessed and processed by third parties we are not aware of, which implies a loss of control over citizens' personal information. Data considered sensitive by the European General Data Protection Regulation (GDPR), (health records, for instance) cannot be processed in Europe for trivial purposes like advertising and, theoretically, cannot be transferred to third parties without user consent or a strict justification that is addressed in the regulation.
For example, if the U.S. Government were to ask a company like YNAB for a user's data records, they could access certain personal information if the respective user linked their bank account to the app. This way, personal data introduced in the app would be disclosed - such as email, phone number, bank account, device data (including location, IP address), or the user's age or gender if this can be inferred or directly obtained from bank account data.
For some companies based in the US, the processing of personal data is essential to their operations. An example we see time and time again is Meta (Facebook), a firm that in recent years has obtained 99% of its profits from advertising and that has warned on several occasions that its business model depends on data transfers to the US. In its 2021 annual report, it directly states that it will "likely be unable" to keep its services running for European users if a new regulatory framework prevents it from sending data to the US, something that would "affect" its business and financial condition (see Meta Platforms Inc. Annual Report filed with the U.S. Securities and Exchange Commission, 31 December 2021, page 9).
In the case of YNAB, all users' data is processed in the US, even though we may be using the service in a European country.
We requested access to a data donor's information (through power of attorney) and directly asked YNAB what recipients they were sharing the information with. To this second part, they replied as follows:
“We disclose the personal data to the following recipients: analytics services, email service, cloud services, financial aggregation services, logging services, cloud security services, marketing services, payment processors, and customer support services.”
When answering our data request, the company also stated that while they disclose the personal data "to recipients located outside of the data subject's home country" that use "appropriate safeguards to protect the personal data," they did not offer further information on what those safeguards are or where the recipients are located.
Something similar happens with Splitwise, an app that allows users to split bills, keep track of debts and monitor expenses with others such as friends, family members, housemates, etc. who have the app account themselves. Splitwise serves individual users who need to monitor their cash flow across multiple shared expenses and groups of users who need to coordinate on joint spending / cost sharing. Unlike other apps we researched for this report, this service cannot connect to a user's bank account.
With the Splitwise app, users can create private groups to share expenses for specific activities such as parties, trips, dinners and more. It also allows people to interact, comment and plan joint costs within each group, adding a social networking feel to finances. Users can tag and locate their expenses across groups; therefore, the more detail one adds, the easier it is for the app to create a more thorough profile for individuals and their contacts in various groups. For instance, from our DSAR requests with data donors using the app, we could confirm that Splitwise indexes very detailed information about users’ relationships, locations, health status and daily activities, as well as very particular habits, some of which could influence someone’s insurance or credit profile quite substantially. Of course, users feed the app with all that information voluntarily and the extent to which they do so determines the various possible risks to their privacy in the long term.
Image: sample from user dataset obtained by Maldita.es and Tactical Tech through DSAR.
To go into detail on how much we put in there, the list of data we received reveals that someone could find out based on the list of expenses if the user has received a COVID vaccine or not. Relating a "celebration dinner" or "celebration wines" to the concept "vaccine" does the job.
In addition, within the US, Splitwise users can connect and share payments via PayPal-owned social payment app Venmo. Tactical Tech's Exposing the Invisible project featured an investigation into Venmo's issues with privacy and users' personal data exposure in this article.
What does “I agree” really mean?
As set out at the beginning, there are several matters that can be questioned from a legal perspective: the fact that financial data is being collected and shared with financial aggregators and marketing partners; that this data can be sent to countries like the US, where the European data protection laws do not apply; and that what happens with such data is not explained transparently to people.
For example, YNAB processes data from European users in the US, which places this data outside the direct reach of the GDPR; it is therefore no longer as protected as it would be under European jurisdiction. National data protection authorities like the French personal data regulator (Commission nationale de l'informatique et des libertés / CNIL) classify the United States as a country that "does not ensure an adequate level of data protection recognized by EU."
Image: Map of “Data protection around the world”. Source: Commission nationale de l'informatique et des libertés / CNIL, https://www.cnil.fr/en/data-protection-around-the-world
In a phone interview with our team, Jelena Adamović, data privacy lawyer and researcher at SHARE Foundation, also emphasizes the issues around obtaining user consent: "this practice where the reading of the privacy policies means that you have consented to anything is directly contrary to the GDPR logic and provisions.... Even more so... if you want to use consent as legal basis for data processing, then there are these very high standards for consent which means that it needs to be freely given, unambiguous and so on." (See GDPR conditions for consent, Article 7.)
Nevertheless, for some types of processing, companies are taking advantage of the legal basis that user consent guarantees: “Consent is an appropriate legal basis if individuals are offered a real choice and control over how the data is used. If consent is a precondition to use a service, it is most likely not the appropriate lawful basis for processing data,” Uttamchandani adds. In most cases, users are not given the chance to accept only some of the conditions, but are forced to agree to all of them at the same time.
Users in some European countries are not fully aware of what they are giving away when they use specific digital services. A qualitative study conducted by Maldita.es revealed that citizens in Spain identify a problem with how their data is handled and acknowledge that their privacy in digital spheres is damaged, but they do not take steps to reverse the situation. Interviewees admitted to not reading the privacy documents of digital services and were not able to point out any consequences of their data being leaked or wrongly used. They see possible detrimental outcomes as something that will likely affect other people, but not themselves.
A recent Eurobarometer on digital rights awareness also illustrates this phenomenon: only just over a third (39%) of the EU population is aware that their rights must also be protected in the digital sphere, such as not being discriminated against and having their privacy preserved. Nearly half (46%) of EU citizens admit they are concerned about how firms and institutions are processing and using their personal data.
When it comes to financial data, there is a direct impact on users if their personal information linked to economic transactions is mistreated:
"The worst-case scenarios I can think of are scoring that may easily lead to discrimination (decisions based on individual scoring about the granting of loans, insurance, etc.) and impersonation and/or fraud in the case of data breaches," reveals Uttamchandani when asked about the potential risk.
Your “contract” is not only with one app... look again at your conditions
As with every other digital service, we must accept some conditions that are set out in privacy policies and legal terms. By doing so, YNAB for instance alerts users that if they decide to link a financial account to the app, they are acknowledging and agreeing “to the terms of the respective privacy policies of those partners” that will facilitate this connection. Furthermore, users will “expressly grant aggregation partners the right, power, and authority to access and transmit information as reasonably necessary” to provide the services. Who are these financial aggregators?
Plaid, MX and TrueLayer. These are the companies in charge of transferring bank transaction histories to the application. They act as a pathway between the app itself and the bank, and it is to them that we hand our banking credentials. “At the direction of the user, YNAB receives access to transaction data (such as date, payee, amount, etc.), as well as account details (such as account name, balance, interest rate, etc.),” explains YNAB.
Plaid, for example, connects our financial accounts with fintech services like YNAB. In the process, it accesses a wide variety of data: our name, email address, phone number, date of birth, address, bank account number, sometimes information about our employer and about loans - including due dates, interest rates and payment plans - as well as account balances and lists of transactions.
These third-party services in turn collaborate with other companies. They also state in their privacy documents that they may share identifiers, financial and commercial information, professional information and location data from users with "financial institutions," "professional advisors" and "analytics service providers," which allow them to provide services to clients like YNAB. Just like YNAB and Splitwise, they operate in the US, therefore data is processed in that country.
Another one of these financial aggregators, MX, works in a similar way: "We also process personal data of clients' customers in an aggregate form to assist our clients, offer meaningful promotions such as personal loans, credit cards, and mortgages, based upon analysed information." Furthermore, information such as app users' spending habits, combined with other personal details and preferences feed into increasingly accurate user profiles, which online advertisers need in order to provide better tailored and targeted marketing campaigns meant to attract further spending.
Be anonymous... so that we can de-anonymise you in a second
Generally, these apps know how much you spend on internet, energy and water at your home. They also know how much tax you pay on a monthly basis and whether you have children. And, as we have seen, apps transfer some of that data to third-party services that can put a name and an identity behind it. These services are called trackers.
Apps use trackers for different purposes. Some make sure that everything is working correctly. For example, Google Crashlytics, which records what the user was doing when the app crashes, or Google Analytics for Firebase, which records different bits of data on how we use the app. Think about it as a diary where these services automatically write down every button we press or every time we open the app. All that information is then linked to a kind of virtual “identification card” (our user ID), which allows the different services to open the diary and see how we are behaving as users.
It can happen that the companies behind these tracking services are collecting a huge amount of data about users or that their privacy practices are actually not that privacy-friendly. Here is one of those cases.
Through a technical analysis of the applications conducted by our research partners at SocialTIC, we found that YNAB uses third-party trackers such as mParticle, Braze or Bugsnag. Some of them have quite an aggressive approach to how they access and collect users' data. This means that even though the budget control app we signed up to applies specific safeguards to users' data, these third-party services can process the information they obtain with other considerations (even though they are supposed to stick to contractual conditions with the respective budget control apps).
For instance, mParticle is the type of service that is behind those push notifications, marketing emails or online advertisements we see related to a specific product that we are using or we have once used. Their goal is to make users engage with a service at all times, especially when they detect that users have stopped using an app for a while.
mParticle has a particularity that should concern users: one of the services it offers to apps is IDSync. This service allows developers to assign users a unique identification code (ID) that can be tracked even when they have only downloaded an app but have not yet logged in. This ID stays attached to the respective user from the moment they create an account. mParticle is essentially saying that it can link anonymised or aggregated data collected from a user who does not yet have an account to that user's identity from the moment they decide to create one. And how do they manage to match the profiles? Through aggregated data they track that is not supposed to identify us.
This way they associate random users that a firm does not recognise with possible future clients who end up creating an account. But they do it on the basis of supposedly non-identifying information, such as device data, IP addresses (which do not identify us directly), or our activity in a particular app. This process amounts to a reversal of anonymised or pseudonymised data and is contrary to a privacy-preserving approach to data.
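The matching logic can be sketched as follows. This is a simplified, hypothetical illustration of the general identity-resolution technique, not mParticle's actual implementation: pre-login events are keyed by device identifiers, and once an account is created, every event sharing those identifiers is folded into the new, named profile.

```python
# Hypothetical sketch of identity resolution: "anonymous" pre-login
# events are later merged into a named profile via shared device
# identifiers. Illustrates the general technique, not any vendor's code.
from collections import defaultdict

events_by_device = defaultdict(list)  # activity logged before sign-up

def track(device_id: str, event: str) -> None:
    # Events are keyed only by a device identifier, so they look
    # non-identifying while the user has no account.
    events_by_device[device_id].append(event)

def resolve_identity(device_id: str, email: str) -> dict:
    # At sign-up, the supposedly anonymous history is re-attached
    # to a real person: the anonymity was only ever provisional.
    return {"email": email, "history": events_by_device[device_id]}

track("device-42", "opened app")          # user has no account yet
track("device-42", "viewed premium plan")
profile = resolve_identity("device-42", "user@example.com")  # account created
print(profile["history"])  # ['opened app', 'viewed premium plan']
```

Once the device identifier bridges the two states, "anonymous" browsing and a named customer record become a single continuous history.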
The company acknowledges that there are "compelling business and legal arguments" against such an approach, but they downplay its importance, mentioning the "chance" for clients to "preserve a complete history of a user's experience with your app."
It also allows clients to look up specific information on a user's profile: "IDSync search allows marketers to query User Profiles by any known identifier, such as email, mobile phone, or device identity, and return all matched user identity values including the mParticle ID," the company explains in the guides they make available for developers. This way, even if a user decides to create a new email address that they have not given to a service, mParticle will associate the email they already have with the new address, thus still tracing them.
Also, apps might not share data directly with Facebook, but if they use mParticle, there might be a transfer of information anyway, given that there are options to integrate Facebook data with mParticle (see more here) and that it has special functionalities to reach audiences on that platform.
The fact that data can be combined to identify us and learn more about us is very relevant. For instance, from the data we obtained during our research via DSAR requests with Tricount users, we could map out a person's granular details, including an entire trip's itinerary: where they went, where they stayed, where they ate, what attractions they visited, how much they paid for food and taxis, when they used their own car, and even when they entered a public WC in the middle of the street. Add to that the names of the people they were with (based on expense details and comments within the app) and, later on, updates on intimate details of their relationship.
Image: sample from user dataset obtained by Maldita.es and Tactical Tech through DSAR.
All of the applications we researched use trackers and third-party services, which allow them to obtain certain information about their users beyond the declarative data we hand to the companies when using their services (name, email, phone number, bank account, etc.). But writing down our expenses on a digital service means disclosing intimate details about our lives to strangers. These companies can then share them with other firms, which will in turn share them with yet others. All in accordance with contractual terms users agree to with one click, but still...
They know what you smoked last week, even if you don’t put it on social media
Tricount, like Splitwise, allows people to share expenses from a trip, for example, and calculate how to divide them between the different participants. It then provides shortcuts to pay the debts through PayPal or an IBAN (a bank account).
Image: screenshot from Tricount app. Source: Maldita.es
The application asks for certain personal data when users sign up: name, email, phone number, a profile picture if desired, and an IBAN if the user wants to add a bank account to the service. This sign-up can be skipped if, instead of creating a new profile with Tricount, we opt for the social login with Facebook or Google, or with an Apple ID if using an iOS device.
The social login allows users to avoid creating an account with a new service at the expense of linking a social media profile - Facebook, in this case. Doing so provides Tricount with instant access to a subset of the data that a user has stored in Facebook. The extent of that data depends on the company's preferences but also on users' privacy settings, and can range from the email and profile picture to the list of friends, posts, pages we like, age, gender, birthday, location, hometown or photos uploaded.
This function works through a package of tools that Meta (Facebook) makes available to developers and third parties, the Facebook Software Development Kit (SDK). It includes a series of plug-ins and trackers that publishers, commercial partners or advertisers can use in order to integrate their app with Facebook, obtain data about how users engage with the platform and the service, and refine their advertising strategies.
Through this functionality, a fintech application developer could tell Facebook what users do inside the app. If the application, in this case Tricount, wanted us to use a particular feature of the service that requires a payment (Tricount also has a premium service), ads related to it could be displayed on the social network. (For a basic read on how ad targeting works across platforms and what you can do about it, read this article on how to "Renovate Your Social Media Profile.")
Asked about this functionality, Facebook explained that they are in the process of restricting developers' data access even further to help prevent abuse. Among their measures, they say they will remove developers' access to Facebook and Instagram data if the user hasn't used the app in three months and that they will restrict the initial data a developer has access to. Facebook, as the user identity provider to Tricount, can obtain a lot of information about each user's behaviour within the app. It knows what they are accessing at any given moment, from which device, etc. This allows it to get to know users better, including tastes, habits, interests, schedules.
Asked about what particular information it could receive from a fintech application, Meta directs us to their help desk web page, where they explain that they "forbid" data providers to share financial data with them. Among this data, they specify bank account or credit card numbers, income, credit ratings or bank balance. They do not mention categories of expenses or spending habits.
Alongside Google, Facebook and Amazon, Tricount has a clearly stated partnership with the Branch tracker, whose primary function is to link data between different platforms and devices, in what is called 'deep linking'. Branch aims to help services maximise user engagement and performance. Based on public documentation, we can infer that it tracks a user by setting a user ID and linking it to certain events and actions, the most relevant ones in our case being: when a user buys something; when a user interacts with any feature of the app; when a user progresses in their usage of the app (creating a profile, linking an email, etc.); and other custom events and actions defined by the app developer. Branch can also record events captured by Google Tag Manager and Google Firebase, both of which Tricount uses, so we can reasonably assume it can access most of the information those Google trackers collect about a user. In practice, this means more accurate user targeting with ads and other content that appeals to specific preferences and needs - and, ultimately, more spending.
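The event model described above can be sketched in a few lines. This is a hypothetical toy, not Branch's actual SDK: the class name, method names and event kinds are our own illustration of how a tracker ties a persistent pseudonymous user ID to purchases, feature usage, lifecycle steps and developer-defined custom events.

```python
# Toy tracker sketch (hypothetical, NOT Branch's real SDK): one stable
# pseudonymous ID per user, with every event linked to that ID.
import time
import uuid
from collections import defaultdict

class Tracker:
    def __init__(self):
        self._events = defaultdict(list)   # user_id -> list of events

    def identify(self):
        """Assign a pseudonymous ID that persists across sessions."""
        return str(uuid.uuid4())

    def log(self, user_id, kind, **properties):
        """Record one event; `kind` could be 'purchase', 'feature_used',
        'lifecycle' or any custom name the app developer defines."""
        self._events[user_id].append(
            {"kind": kind, "ts": time.time(), **properties}
        )

    def profile(self, user_id):
        """Everything the tracker has linked to this user so far."""
        return list(self._events[user_id])
```

Even this bare-bones version shows why such logs are valuable: a few `log` calls per session accumulate into a behavioural history that can be queried per user and fed into ad targeting.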
Remember what we said earlier about how we were disclosing our expenses? Think of it as keeping a diary of all the alcohol you have purchased in the last week, and the previous one, and the previous one. If someone saw it, would they consider that you drink too much, for example? There are services whose main objective is inferring that from what they know about you.
Deleting your account does not mean deleting your data
It is not a trivial matter when we say that, as users, we are hardly aware of how our data is being handled. This becomes even more important with financial data, which reaches into people's private sphere and how they spend their money. It hit us especially hard when we found out that one of the companies we sent data requests to provided us a copy of the personal data they kept... about another user.
In this particular case, we requested data (via DSAR) from one of the apps about a user who had left the service a few months earlier. The company informed us that because this person had deleted the account, they were no longer storing financial data. They did keep, nonetheless, a file with some personal data, like email address, IP address, location, time of use, device data and also the budgeting functions the user had activated - for example, whether they were planning a budget for a vacation or for retirement.
All of this information - but from a different user - was delivered to us in response to our DSAR request. The firm only noticed the mistake when, confused, we pointed out inconsistencies with the situation of our actual data donor. It turned out that, just like the person who lent us their information voluntarily for this investigation, the victim of the breach had also deleted their account - not in 2021, as our data donor did, but in 2019, when they closed it after a month of free-trial use.
Image: sample from user dataset we obtained from the app company about “wrong user”. Personal data censored.
Image: sample from user dataset obtained from the app company about “wrong user”.
This is something that has to be taken into account from a security perspective: if the firm suffered a data breach, the data of a person who had not used the service for three whole years would also be exposed to the attackers. With such a list of personal data, cybercriminals can impersonate the victim, commit fraud with the information or sell it to data brokers, among other things.
In our case, the company did inform us, as soon as the mistake was detected, that the affected user had been alerted that “limited personal data” had been disclosed to third parties.
Note: the name of the company is not disclosed since we were not able to verify if the breach happened due to a procedural, technical or human error.
Privacy policies are so private they hide the most important parts
Data protection lawyer Rahul Uttamchandani spots at least two infringements of GDPR: "Failing to provide the necessary information (transparency obligation) and failing to comply with the principles of processing (processing without lawful basis)."
Third-party agents, service providers, affiliates and subsidiaries of the app's company performing different functions - such as "maintenance services, database management, cloud hosting, web and mobile analytics, receipt scanning services and OCR" - receive access to users' personal information for specific purposes, and are "contractually obligated to abide" by the app's privacy practices.
When asked if the firm used any automated decision system on users’ data, Splitwise answered to us via email: “Splitwise does not make use of any automated decisions that create legal effects or similarly significant effects on our customers, including profiling.”
It was definitely a “bumpy ride”…
The path to obtaining all this information was a bumpy one. In order to ask companies what kind of data they were keeping about users and how they were handling it, we used Data Subject Access Requests (DSARs), a legal procedure provided under the GDPR that any person in the EU can use to request access to their own data as collected and stored by various institutions (companies, services, public bodies, etc.). In practice, this is not so easy: companies sometimes do not fully respect the rules on how these requests can be made and, as a matter of course, they do not respond with everything they should.
There is plenty that a DSAR can tell us. When we exercise our right of access, we are entitled to obtain a copy of our personal data from an institution (also called a "data controller"), but also a series of details about how that data is handled. For example, it gives us the right to know what companies and services a data controller shares our data with and for what purpose; whether a company has obtained data about us from third parties; and whether any automated decision has been made about us, which can affect us in any sphere of our lives.
This particular case is very interesting in the fintech ecosystem, as we stated previously in this article. A profile based on our expenses, our daily habits and the amount of money we keep in our bank accounts is a juicy product for financial aggregators whose job consists in analysing whether our economic profile meets the necessary requirements to be given a loan, or whether our insurance plan can be approved.
"The worst-case scenarios I can think of are scoring that may easily lead to discrimination (decisions based on individual scoring about the granting of loans, insurance, etc.) and impersonation and/or fraud in the case of data breaches" - Rahul Uttamchandani, data protection specialist.
What often happens when exercising the right of access is that companies provide a copy of the data they keep associated with a user, but leave the rest of the inquiries unattended. The law gives us a way to directly demand answers on these matters, but companies are simply not complying with it, ignoring certain parts of the requests. The apps we analysed were no exception, and some of them were hesitant to fulfil the requests.
Firstly, you must know that as a citizen of the EU you can ask another individual or an organization to send a DSAR on your behalf. You do so by giving them a power of attorney (via written documentation and agreement) to represent you, and to submit and receive your request. This means that you could ask Maldita.es or Tactical Tech to ask a company about the data they have about you. That is exactly what the users who donated their data for this investigation did.
Nevertheless, at least one of the companies behind the apps was not in favour of complying with the right to representation granted under the GDPR. Fintonic did not want to send us the data they had gathered from our data donors unless the donors themselves submitted the request directly to the company. We ultimately did ask our data donors to submit these requests, meaning we received the data from Fintonic much later than we should have based on our initial request using the right to representation. This is something that could easily be reported to a data protection authority in a European country, because it does not comply with GDPR rules.
Sometimes firms know this and use it in their favour, as Ángela Álvarez, member of MyDataMood, a Spanish firm specialised in DSARs, recalls:
"Companies comply poorly with the right of access. Most of them delay their answers and disclose only some of the data they hold about clients; in most cases they do not actually provide all the information they should, handing over only the declared data that users themselves have supplied, like their name or their email address... They rarely also transfer the usage data that the company processes to provide the service," Álvarez states. "Other issues, like the logic that profiling algorithms follow, are practically never disclosed by the firm."
Fintonic did give us a file with a small amount of personal data about our data donors once we invoked our right to send a DSAR on behalf of an individual who has given permission for it. When we insisted that a lot of user data was missing, they denied the request once again, alluding to “security reasons” and “users’ privacy.” We only managed to access the data after asking the data donors to contact Fintonic themselves and authorise the transfer of data to us. Even then, they provided no further explanation in response to an important question about how automated decisions were made about users’ profiles through FinScore, their grading algorithm.
Plainly, there is a contradiction between the reasons the firm gave for denying our request and the legal requirements for doing so. Apparently, personal data like names, phone numbers or IP addresses could be handed over, but not the user data linked to the service: the financial information being compiled, who it is shared with, or the criteria used to assign a credit score to users. To restate, this is information that the GDPR entitles a user to find out, and a refusal to provide it could be reported to the Spanish Data Protection Agency (AEPD).
As for the rest of the applications, none of them except YNAB handed in the additional information specified in the request before we explicitly asked them again about it.
What do we do from here onward?
The obstacles we encountered and the findings that emerged showed us the extent to which the right of information access can be challenging, confusing and sometimes unreachable for users who do not have expertise in digital rights or the time to research in depth how to exercise it. There are a few important aspects to keep in mind when deciding whether to give our trust, and hand over our data, to fintech apps such as the ones we analysed in this report. Of course - and we cannot repeat this enough - these points are generally valid for any other apps as well.
This does not mean that using these apps is wrong. On the contrary, they can be very helpful for keeping better track of our personal finances, but it is important to know that there are trade-offs involved. We gain a bit while we give a bit (or more).
Before using a service, we must know what is at stake. For example, users’ data goes not only to the particular company that provides the app, but to a long list of third-party services and companies that can access and use personal data and user statistics for different purposes. At this level, however, user awareness can only be achieved if we know how these services work, and this is not an easy task.
Reading privacy policies, for example, and questioning things we do not understand is the very first step. For example, what is that service or company that I have never heard of, which can access and process data from my budget app? What is that tracker that my app uses in order to analyse my spending habits and place ads in my app? Why is that needed and how does it actually work?
When we start using an app - financial or any other kind - we must remember that we commit to a series of interactions and conditions as soon as we open it and make our first attempt to use it. We often “agree” to terms of service and privacy policies as soon as we are prompted to, and rarely dig further. In some cases, we cannot even use the app or the key functions we signed up for unless we “agree.” Sometimes we are not even aware of agreeing because we just want to use the app, but we still need to be conscious of how it works.
Learning how to send a DSAR - the user data request process we have described in this article - is one of the methods to achieve this. But as we have seen, there is a long way to go.
Additional documents and publications
Investigation report and methodology in Spanish (PDF): "Tu cartera ya no está en tu bolsillo, sino en tu móvil: ¿qué saben de ti las aplicaciones de control de gastos?", Maldita.es, March 2022.
Spanish-language reporting on the investigation by Maldita.es, in three parts: part 1; part 2; part 3.
Technical Report on Tricount, Splitwise and YNAB (PDF), by Paul Aguilar and Diego Morábito, SocialTIC, February 2022.
Credits and License
Research and reporting by:
- Naiara Bellio López-Molina and the team from Maldita Tecnología
- Laura Ranca and the Exposing the Invisible team from Tactical Tech
- Paul Aguilar and Diego Morábito from SocialTIC
Editorial supervision from Tactical Tech: Marek Tuszynski
English editing: Christy Lange
Graphic design: Yiorgos Bagakis
Advice and input on GDPR matters: Jelena Adamović and Danilo Krivokapic from SHARE Foundation; Rahul Uttamchandani.
The report is licensed under a Creative Commons Attribution-ShareAlike 4.0 International license / CC BY-SA 4.0
The production of this investigative story was supported by a grant from the Investigative Journalism for Europe (IJ4EU) fund.