29-31 Bóthar Adelaide, Baile Átha Cliath, D02 X285
29-31 Adelaide Road, Dublin, D02 X285
T +353 1 678 2000 | 1890 44 99 00 | www.dccae.gov.ie

23 August 2018

Ken Foxe, www.righttoknow.ie

Re: FOI/2018/204b

Dear Mr Foxe,

I refer to the below request, which you made under the Freedom of Information Act 2014 (the FOI Act), for the following records held by this body:

"- copies of minutes or notes taken at the meeting between senior officials from the Dept and Facebook in Dublin (immediately after the NY meeting).
- copies of any presentations/briefing materials given or prepared for the meeting.
- copies of any correspondence between the Department and Facebook with a date range beginning July 17 to date of receipt of this request [20 August 2018]."

I have now made a final decision to part-grant your request on 23 August 2018. The purpose of this letter is to explain that decision. This explanation has the following parts:

1. A schedule of all of the records covered by your request;
2. An explanation of the relevant findings concerning the records to which access is denied; and
3. A statement of how you can appeal this decision should you wish to do so.

This letter addresses each of these three parts in turn.

1. Schedule of Records

A schedule is enclosed with this letter, which shows the documents that I have identified as relevant to your request. It describes each document and refers to the sections of the FOI Act which exempt them from release, where applicable. The schedule also refers you to the sections of the detailed explanation given under heading 2 below which are relevant to the document in question, and gives you a summary and overview of the decision as a whole.

2. Findings

Access to records 1, 3-7, 9 and 10 is granted. Access to records 2 and 8 is part-granted: personal information within the meaning of s. 37(1) of the FOI Act has been redacted.

3. Rights of Appeal to the Office of the Information Commissioner

In the event that you are unhappy with this decision you may appeal. You can do so by writing to the Freedom of Information Unit, Department of Communications, Energy and Natural Resources, Elm House, Earlsvale Rd, Cavan, Co. Cavan, or by e-mail to FOI.UNIT@dcenr.gov.ie. Your correspondence should include a fee of €30 for processing the appeal. Payment should be made by way of bank draft, money order, postal order or personal cheque made payable to the Department of Communications, Energy & Natural Resources. If you wish to make payment by electronic means, please contact the FOI Unit directly.

You should make your appeal within 4 weeks from the date of this notification, where a day is defined as a working day, excluding weekends and public holidays. However, the making of a late appeal may be permitted in appropriate circumstances. The appeal will involve a complete reconsideration of the matter by a more senior member of the staff of this body.

Should you have any questions or concerns regarding the above, please contact me by telephone on 01 678 2383 or by email at Ciaran.Shanley@dccae.gov.ie.

Ciarán Shanley
FOI Decision Maker
Broadcasting & Media Division

FOI Request Reference: FOI/2018/204b

Schedule of Records: Summary of Decision Making

Description of request: "- copies of minutes or notes taken at the meeting between senior officials from the Dept and Facebook in Dublin (immediately after the NY meeting).
- copies of any presentations/briefing materials given or prepared for the meeting.
- copies of any correspondence between the Department and Facebook with a date range beginning July 17 to date of receipt of this request [20 August 2018]."

Record No. | Brief Description & Date of Record | No. of Pages | Findings/Conclusions (Grant/Refuse/Part-Grant) | Basis of Refusal: Section of Act / Deletions
1 | Note_Minister_Facebook_Meeting_16.4.18 | 2 | Grant |
2 | Email_C4_Dispatches_12.7.18 | 1 | Part-Grant | s. 37(1): personal information redacted
3 | Dispatches_Note_FB | 5 | Grant |
4 | Email_C4_Dispatches_17.7.18 | 1 | Grant |
5 | Note_Meeting_Minister_Facebook_19.7.18 | 2 | Grant |
6 | Note_Meeting_Facebook_23.7.18 | 2 | Grant |
7 | Email_Re_FOI_30.7.18 | 2 | Grant |
8 | Email_Re_JOC_31.7.18 | 2 | Part-Grant | s. 37(1): personal information redacted
9 | FB_Opening Statement_JOC_31.7.18 | 10 | Grant |
10 | Email_Re_FOI_7.8.18 | 2 | Grant |

Meeting of Facebook with Minister Naughten, 16 April 2018

Attendance: Minister Naughten; Suzie Coogan, Press Adviser; Jill Mellor, Special Adviser; Patricia Cronin, Assistant Secretary; Tríona Quill, Principal Officer.
Facebook: Joel Kaplan, VP Global Public Policy; Niamh Sweeney, Head of Public Policy Ireland; Yvonne Cunnane, Head of DP and Associate General Counsel; Richard Allan, VP, Public Policy EMEA; and Clare Rush, Legal Counsel.

Facebook advised that they had requested the meeting to brief Minister Naughten in advance of their attendance at the Joint Oireachtas Committee on Communications, Climate Action and Environment the following day.

Cambridge Analytica

Minister Naughten asked a number of questions, including:
• why Facebook had not acted until 2014, given that the Data Protection Commissioner had flagged the issue in 2011 and again in a follow-up audit in 2012;
• why Facebook had not contacted the individuals affected when they realised in 2015 that the information was still being accessed in contravention of their own terms of service, and why they had not advised the DPC of this; and
• why Facebook had not checked whether breaches had happened in the case of other apps, and whether there could be other cases.

Facebook's responses centred on the fact that the data controller in such cases is the app developer rather than Facebook. They had had ongoing engagement with the DPC's office on the issues raised in 2011 and 2012, so it was not the case that they had taken no action. However, they recognised that they did not go far enough in addressing the matter, and their approach has now changed. Facebook agreed that they should have advised the DPC in 2015 of continuing breaches and should have conducted an audit to ensure the data was in fact deleted, as they had been advised. They confirmed that under GDPR, users will need to give explicit consent in relation to sensitive data such as religious or political affiliation, sexual orientation, biometric data and online behavioural tool settings. However, app developers will not need to demonstrate to Facebook that they are GDPR compliant.

PMB on Online Advertising and Social Media (Transparency) Bill

Facebook had met with Deputy Lawless and were supportive of aspects of the Bill in relation to transparency; however, they had identified some deficiencies. From April 25th, Ireland was being added to the testing phase for implementing transparency of political advertising, which will roll out elsewhere from June. It will enable users to click on an advert and see what other adverts are being run by the same organisation. The tool will ultimately also reveal the identity of the advertiser.
Political issue advertisements are trickier than campaign adverts, but will be included within scope. Facebook also raised an issue in relation to Irish Facebook pages being turned into adverts outside the country and targeted back at the Irish population. Minister Naughten advised that SIPO was the appropriate authority in such cases.

From: Dualta OBroin
To: Ciara Swaine
Cc: Ciaran Shanley
Subject: FW: Follow-up re Channel 4 Dispatches
Date: 25 July 2018 17:21:19
Attachments: Dispatches FB.docx; image002.png

Re FOI_2018_180 #2

From: Niamh Sweeney
Sent: 12 July 2018 17:58
To: Dualta OBroin
Subject: Follow-up re Channel 4 Dispatches

Dualta,

Further to our conversation yesterday, please see attached a document that sets out some of what I described - and Facebook's response.

Perhaps we could follow up again with a call at some point tomorrow? As it happens, I will also be in New York on Friday of next week, July 20, working from our office there, if that suited for follow-up with the Minister.

Best,
Niamh

Niamh Sweeney
Head of Public Policy, Ireland
Facebook
4 Grand Canal Square, Dublin 2

The Dispatches team at Channel 4 recently brought some extremely important issues to our attention. It is clear from the information they have sent us about a programme that will broadcast this Tuesday, July 17, that some of the instructions given by trainers at our outsourced content review centre, as well as comments they made, do not reflect Facebook's policies or values and fell short of the high standards we expect. This content review centre is run by our outsourcing partner Cpl.

Content review is a core function of what our business does. Keeping people safe and secure on our platform is of the utmost importance, so I wanted to let you know how we're dealing with the issues raised by the Dispatches team, and how we think about content issues more generally.

On foot of the information we have received from Dispatches, we have taken some immediate action, including a review of training practices across our content review teams:
• Refresher training by Facebook employees for all content review trainers working at the Dublin site (this has started);
• A review of staffing to ensure that anyone who has acted in a manner inconsistent with our company and content policies and values no longer works to review content on our platform; and
• Updated training materials for all reviewers — clarifying our policies in all the areas where wrong decisions regarding removal of the content were surfaced by Dispatches.

In addition, in relation to the content where mistakes were clearly made, we've gone back and taken the correct action.

Facebook's Approach to Enforcement of our Community Standards

We want Facebook to be a place where people can share freely and debate difficult, even controversial, issues. But people will only share if they feel safe. And that's why we have clear standards about what's acceptable on Facebook — standards which have been publicly available for many years. In April, for the first time, we also published the internal guidelines used by our review teams — you can read them here [https://www.facebook.com/communitystandards/].

Around 1.4 billion people use Facebook every day from all around the world. They post in dozens of different languages: everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues — from bullying and hate speech to terrorism and war crimes.
It's why we developed our Community Standards with input from outside experts — including academics, NGOs and lawyers from around the world.

Cpl is one of several companies Facebook uses to help us manage and review content on our platform. We have worked with them since 2010, and they currently have 650 people working on these issues, including 14 trainers. Cpl employees take action on reports, and escalate decisions where necessary to full-time Facebook staff with deep subject matter and country expertise. For specific, highly problematic types of content such as child abuse, the final decisions are made by Facebook employees on our Community Operations teams.

This work can involve looking at very disturbing content. It's why all our employees working to review Facebook content are offered psychological help to ensure their wellbeing. The programme — which was put together with input from psychologists — includes:
• Wellbeing training and pre- and post-support for training modules;
• Peer supervision, and support for those reviewing certain content;
• Enhanced training on anxiety awareness, trauma, stress management, and personal resilience;
• Access to private healthcare from the first day of employment; and
• A 24/7 health support model, with practitioners on site.

In addition, Facebook's content policy team — which devises our Community Standards and internal guidelines — speaks regularly to the teams working on content review so that they can answer any questions people may have. When a policy is updated, we retrain the reviewers so that they understand the specific changes.

Transparency and accountability

While we strongly agree with some of the criticisms levelled by Dispatches, we believe that Facebook is transparent about the details of our Community Standards, and how we enforce them. We've published these standards and the internal guidelines used by our review teams, and consulted more than one hundred organisations and experts about the substance and wording of the standards to ensure that they are comprehensive and readable. In May we also held a full-day briefing session on our policies and enforcement with academics and legal experts in Oxford (as well as Paris and Berlin) to get their feedback and input. And we have allowed journalists to sit in on our Content Standards Forum, the bi-weekly meeting where we discuss and adopt changes to our Community Standards.

Resourcing and review backlogs

The backlog in content review queues experienced during the Dispatches filming occurred on select days in March and April. It was cleared entirely by 6 April because additional reviewers were hired, and reports were sent to our other review locations with the additional capacity to handle them. It is important to stress that this backlog did not include reports of suicidal content. Reports about suicide are considered "High Priority," and almost 100% of High Priority reports during those two months were reviewed within the set timeframe.

We understand that the timely review of reports is essential to keeping people safe on Facebook. This is why we're doubling the number of people working on our safety and security teams this year to 20,000. This includes over 7,500 content reviewers. We are also investing heavily in new technology to help deal with problematic content on Facebook more effectively.
For example, we now automatically route reports to reviewers with the right expertise and cut out duplicate reports, so that if 100 people report the same piece of content, we don't have 100 people reviewing it. This technology can also help us detect and remove content such as terrorist propaganda and images of sexual abuse of children before it has even been reported. In the first quarter of 2018, for example, 99.5% of terrorist content that was removed from the site was identified by our technology without anyone needing to flag it. This also means that fewer reviewers have to see distressing content, as it's removed automatically.

Content containing violent and disturbing images

Videos of minors fighting can be very upsetting. That's why we remove bullying videos that are reported to us unless they are shared to condemn the behaviour; even those shared to condemn are restricted to adults only and include an interstitial warning screen letting people know the content may be disturbing. They must then click through the warning if they still want to see the content. We will always remove such content, regardless of the context in which it is shared, when the minor or their guardian has requested it.

Content depicting child abuse

We always remove non-sexual child abuse imagery except where the sharing of the image may lead to the rescue of a child in danger. This is on the advice of outside experts, who have advised that leaving such content up can save children from abuse. In the relatively few cases where this disturbing content remains on the platform, we apply a warning screen and limit its distribution. As soon as we have confirmed that the child has been identified and rescued, we remove it. If we're able to identify the country from which this abusive content came, we will report it to law enforcement.

We take a zero-tolerance approach to child sexual abuse imagery. Whether detected by our technology or reported to us, we remove it as soon as we find it. Any apparent child sexual exploitation content is reported to the National Center for Missing and Exploited Children ("NCMEC": http://www.missingkids.com/home). We also use technology to prevent this content being uploaded onto Facebook again.

Minors

We do not allow people under 13 to have a Facebook account. If a Facebook user is reported to us as being under 13, a reviewer will look at the content on their profile (text and photos) to try to ascertain their age. If they believe the person is under 13, the account will be put on hold. This means they cannot use Facebook until they provide proof of their age. We are investigating why any of our reviewers or trainers would have suggested otherwise. It is right, however, that the reviewer would not take action on the account, because it was not reported as being under 13. Our contractors do not proactively put a hold on accounts they suspect may be under 13.

Suicide and self-injury

We take the issue of self-harm extremely seriously and have developed our policies with significant input from experts, such as Save.org, National Suicide Prevention Lifeline, and Forefront. Facebook does not allow content that promotes or encourages self-harm or self-injury, or, in the case of Live video, a credible or successful attempt at suicide. However, we do allow content posted by people who contemplate or admit to engaging in self-harm or self-injury.
This is because sharing these thoughts can be therapeutic as well as a "cry for help." As friends and family are already connected on Facebook, our platform is uniquely positioned to help people in distress make contact with a loved one. If anyone posts self-harm or self-injury-related content – regardless of whether the content is removed for violating our policies – we send them information about where to get help. And people watching a live video have the option to reach out to the person directly or report the video to us.

Finally, since November of 2017, we've been using artificial intelligence to identify people expressing thoughts of suicide. We can then reach out to offer help or work with first responders, which we've now done in over a thousand cases.

Facebook's business model

It has been suggested that turning a blind eye to problematic content is in our interests – we strongly disagree with this. Creating a safe environment where people from all over the world can share and connect is core to our business model. If our services aren't safe, people won't share and over time would stop using them. Nor do advertisers want their brands associated with disturbing or problematic content — and advertising is Facebook's main source of revenue. It's why we have very clear Community Standards and we've invested billions of dollars in working to enforce them effectively — using both people and improved technology like artificial intelligence, computer vision and machine learning.

Local jurisdictions

When something on Facebook is reported to us as violating local law but doesn't break our Community Standards, we review it carefully and may restrict it in the country where it is alleged to be illegal. You can see these content removal requests in our biannual Transparency Report.

Hate speech

Some statements that were captured on camera by the Dispatches team relating to how we treat hate speech are untrue and at odds with our values. It's why we're reviewing what happened here to see what went wrong and prevent it happening again. Hate speech is never acceptable on Facebook, and we're increasingly using technology to proactively detect it — without anyone needing to report it. In fact, of the 2.5 million pieces of hate speech we removed from Facebook in the first three months of 2018, 38% was flagged by our technology. Examples of dehumanising hate speech that were discussed on camera and treated incorrectly have since been deleted, and, where appropriate, we are using image-matching software to prevent them being uploaded again.

Shield Pages and the 'Britain First' Page

We removed Britain First from Facebook in March because their Pages repeatedly violated our Community Standards. Britain First was a registered political party in the UK and, until recently, had stood in elections. That's why we took great care dealing with this issue; we are reluctant to censor political content or interfere in national debates. We remove content from Facebook, no matter who posts it, when it breaks our Community Standards. However, we do have a process to allow for a second look at certain Pages, Profiles, or pieces of content to make sure we've correctly applied our policies. While this process was previously referred to as "shield," or shielded review, we changed the name to "Cross Check" in May to more accurately reflect the process. We use Cross Check when reviewing high-profile, regularly-visited Pages or pieces of content on Facebook to help prevent them from being erroneously removed.
We may Cross Check posts of celebrities, governments, news organisations or Pages where we have made mistakes in the past based on the type of content posted. For example, we Cross Checked decisions on an American civil rights activist's account when content was reported, to avoid deleting posts where he was highlighting hate speech he'd been subjected to. However, if someone posts something that breaks our Standards and their Page has been Cross Checked, we would still remove that content from Facebook after double-checking this was the correct decision. This is what we did with the Britain First Page in March of this year.

We are grateful to Dispatches for bringing these incredibly important issues to our attention. Some of the behaviour they have captured for their programme is consistent with neither Facebook's policies nor our values. It's why we're investigating what happened and have already taken action to prevent it happening again. Finally, we have also spoken to Channel 4 on the record, and provided them with an extensive interview, which I hope will be understood as a sign of our willingness to engage on this and to address the problems they have highlighted for us.

From: Dualta OBroin
To: Ciara Swaine
Cc: Ciaran Shanley
Subject: FW: Follow-up re Channel 4 Dispatches
Date: 25 July 2018 17:20:53
Attachments: image002.png

Re FOI_2018_180 #1

From: Niamh Sweeney
Sent: 17 July 2018 16:29
To: Dualta OBroin
Subject: Re: Follow-up re Channel 4 Dispatches

Hi Dualta,

I wanted to share some additional information, including a link to a full transcript of our interview with Channel 4 and our written response to the Dispatches team, which includes some additional detail about the issues they have raised with us. You can find it here.

Please do let me know if you have any questions or if it would be helpful to arrange a meeting to discuss this.

Best,
Niamh

Niamh Sweeney
Head of Public Policy, Ireland
Facebook
4 Grand Canal Square, Dublin 2

Minute of Meeting with Facebook Representatives
Fitzpatrick Hotel, Grand Central, Boardroom
19th July 2018 at 4pm Local Time

Attendees: Minister Denis Naughten; Brian Carroll, Assistant Secretary; Jean Andrews, Special Advisor; Suzanne Coogan, Press Advisor
Video link: Niamh Sweeney, Facebook; Siobhán Cummiskey, Facebook; Gareth Lambe, Facebook
Minutes: Seána McGearty, Private Secretary to Minister

Minister: I am appalled at the reports I have received regarding the Dispatches programme, especially as an advocate of child protection and as someone who has defended Facebook publicly on a number of occasions. This is the second time that Niamh Sweeney has come in front of me to apologise. There needs to be a significant step change in Facebook's approach to this issue.

Gareth: Two things. First, I am disappointed in the failings in moderation. This is a difficult and complex space. Even before the programme aired, we organised training refreshers and a full investigation into the issues highlighted. Secondly, we are trying to create a safe environment for people across the world, not turning a blind eye. If our services aren't safe, advertisers and users won't use our platform. Content is not ignored. This is very serious; there is nothing good about this situation.

Niamh: The difficulty, Gareth, is that we are saying that and someone on camera is showing something different. We are all appalled. At this point I would like to introduce Siobhán as head of content management. Siobhán is particularly frustrated.
What was captured on camera used training materials never before seen by Facebook.

Minister: The two failings I want addressed are: first, the content was flagged with Facebook as inappropriate and wasn't taken down; and second, this content was then used for training purposes. This sort of content shouldn't go up on the platform in the first instance; the system broke down on this occasion. If CPL aren't doing well, then why are they still contracted to Facebook? This should not and can't be tolerated on any level. Retraining and reviews are not enough. Tell me the significant measures going to be introduced because of this Dispatches programme.

Siobhán: I would like to reiterate that we are very disappointed and concerned, and these are not our values. We create policy on protecting people from harmful persons/organisations. The child abuse example we take very seriously. We use 'PhotoDNA' from Microsoft to make sure known child imagery does not reappear. New images are banked and reported to NCMEC, where they determine the approach. In the vast majority of cases the content is removed and banked so it won't reappear. We have to keep updating the software. Child violence footage is left up on the platform where the child is in danger and the footage could be used to determine location or bring the child to safety. In the case of the footage used in the Dispatches programme, a neighbour of the child saw the footage on the platform and reported it to the authorities, and the child was saved. There is a very narrow scope of cases in which the image/footage is left up. In some it can't be sent to law enforcement, as it doesn't show where the video was taken or has no details for the authorities to conduct a search. My team has spent the week dealing with this item and systematically going through videos to ensure this doesn't happen again.

Niamh: We have brought in the people shown on the Dispatches programme for retraining. We don't know how long this has been happening for or how many people it affects. The person doing the recording for Dispatches was in for 7 days and got 1 hour of video only. Did we take the eye off the ball?

Siobhán: Retraining was done before the programme aired. CPL and other outsourced contractors meet Facebook every week to answer specific questions. In reviewing content, it can be marked as disturbing, passed to a manager, passed to the content team etc., with specific steps for each member of the content team. The quality team does internal checks weekly. We are going to work even closer with CPL.

Niamh: 7,500 content reviewers; it is difficult to train after doubling the size of the team in 12 months. "It's all about making money" was a comment in the Dispatches programme; that is not what Facebook is about. Advertisers do not want content like that associated with them, so it makes no business sense to allow content like that. The flippant comments during the training are the most disturbing.

Suzie: Not through the programme…

Minister: This was used as a training tool! It should never have happened. Make sure there is never a repeat of this. Colleagues across the EU are going to ask me about this. What significant changes are Facebook going to make so this never happens again? What do you say to the NGOs? Disturbing content being flagged and not being addressed: there is a lot of criticism of Facebook's failure to act. How do you intend to reassure the public going forward? If content is flagged, why is it not actioned?

Niamh: Two issues: this wasn't a moment of our greatness, but this isn't going to dismantle Facebook and CPL. Test results on our content are good.
Siobhán: We are monitored by the EU Commission and we get reviewed twice a year. We received the highest result: 79% of hate speech removed, with 89% of it removed within 24hrs. This was a mistake, but we are at 98% with AI; 38% is removed before it is even reviewed. Transparency report in April. AI is where we are going. Where you have humans, you have human error. We are going through all the retraining and have changed all the training materials.

Gareth: We need to manage expectations: first, we can't guarantee this won't happen again, and second, it is unlikely that we can monitor hate speech to satisfy everyone, as it can be subjective.

Minister: The difficulty is, this was being taught to moderators. To what extent – how long did this go on for?

Suzie: Who decides what is acceptable content? Bullying towards citizens of Ireland – who decides what is acceptable?

Niamh: In the interests of transparency, our standards are all available on the web. We are a lot more transparent than ever before as of two months ago. There is a definite difference in the way private individuals and public figures are treated online. Not every complaint about content is in fact grievous content.

Siobhán: Teams are based all around the world; the bullying policy is very expansive. You can't laugh at, mock by name etc. If it is reported it will be taken down. Protection of minors is very important to us; can't stress that enough. If they or their guardians report it, then it will be removed.

Jean: What proportion of turnover is invested in these AI technologies?

Gareth: Mark Zuckerberg is increasing moderators from 7,500 to 22,000, which is a significant investment.

Niamh: We are hiring an additional 10,000, including content reviewers; a 50% increase globally working on security and reviewing content. Most profits are put back into the business. 3,000 work in Ireland; the fastest growing of those teams are the safety teams.

Minister: There needs to be a statement from Facebook.

Niamh: We will turn up and play our part in any Oireachtas meetings etc. I hear you loud and clear. We published the letter to Channel 4 addressing all the matters raised in the programme. We are not being silent on this and we were ready this time. I am sorry again and I thank you for taking the time to meet with us.

Minister: I feel I have been a child advocate for so many years and I'm embarrassed about the reports I got.

Note of Meeting between Representatives of Facebook and the Department of Communications, Climate Action and Environment
23 July 2018, Adelaide Road

In attendance:
Facebook: Gareth Lambe – Head of Facebook Ireland; Siobhán Cummiskey – Head of Content Policy EMEA; Niamh Sweeney – Head of Public Policy Ireland (by phone)
DCCAE: Patricia Cronin – Assistant Secretary; Tríona Quill – Head of Broadcasting and Media Division; Dualta Ó Broin – Broadcasting and Media Division

Summary:
- Facebook (FB) opened the meeting by saying that they were very disappointed and disturbed by what the C4 programme had revealed and that they were investigating how the failings in content moderation had occurred. They also said that they wanted to be very clear that it was not in FB's interest to carry extreme content and that it was especially not in their advertisers' interest to be associated with extreme content.
- Officials asked FB to give an overview of the actions they were going to take, and of the detail of the approach they would be taking to rectify the issues raised.
- FB gave a detailed account of the failings that had occurred, and the actions which should have been taken.
- Officials queried the relationship which FB has with An Garda Síochána and local law enforcement globally, and FB outlined the process which the company follows in respect of illegal content.
- Officials queried what the timelines for illegal or harmful content removal were. FB responded that it was difficult to give a precise time frame for all cases, but that most notifications were responded to within 24 hours and, in the case of high-priority cases such as Child Sexual Abuse Material or Suicide/Self-harm, the response times were much faster than that.
- FB gave an overview of the image recognition technology which the company has in place, and the improvements which are planned to that technology.
- FB outlined that its review of the issues has two strands – one in relation to the content, and the other in relation to the processes. An investigation has been launched which will be carried out by FB's own investigation team – rather than a CPL team. In addition, FB said that there were three key actions which the company would be taking. Firstly, in addition to retraining their agency staff, they would bring them closer to the FB content moderation team. They recognised that there was a failure to explain the rationale behind the Community Standards to agency staff and that is something that they would be working on. Secondly, they would be introducing new features to their services which will allow users, in certain circumstances, to appeal against decisions which had been taken by their moderation teams; such appeals would then be reviewed by a separate team within FB. Finally, they would be reviewing their Child Sexual Abuse Policy, in consultation with a range of international stakeholders, to ensure that the measures which the company had in place were effective.
- Officials referred to the establishment of the National Advisory Council on Online Safety, and FB confirmed they would agree to participate in the Council and to engage in that process fully.
- Officials gave an outline of the various approaches to regulation which were being discussed, including the Digital Safety Commissioner Private Members' Bill and the implementation of the revision to the Audiovisual Media Services Directive, and said that DCCAE's priority was to ensure that the regulatory system put in place was effective. DCCAE will require FB to cooperate fully in the stakeholder engagement aspect of these processes, and FB indicated that they would.
- FB referred to the fact that they would be appearing before the JOC on 1 August, and would give a detailed outline at that point of what their investigations had revealed.

ENDS

From: Dualta OBroin
To: Ciaran Shanley
Subject: FW: FOI's
Date: 31 July 2018 16:11:43
Attachments: FOI_FB.zip

From: Dualta OBroin
Sent: 30 July 2018 15:59
To: 'Niamh Sweeney'
Subject: FW: FOI's

Hi Niamh
We've received a few FOI requests about the meetings in the last fortnight. When you have a chance, would you mind having a look over the documents contained in the attached folder and let us know if there's anything that should be redacted. I don't believe so as your documents are published – but you might be able to confirm. We're hopeful to get these released early next week if you could come back before then?
Many thanks
Dualta

From: Ciaran Shanley
Sent: 30 July 2018 15:56
To: Dualta OBroin
Subject: FOI's

Dualta,
The documents in the attached zip file relate to two FOI requests that we have on hand.
Could you forward them to Niamh and ask if there's anything that should be redacted as commercially sensitive, and if she could get back to us as soon as she can.
Thanks,
——
Ciarán Shanley, Administrative Officer
Broadcasting & Media
——
Roinn Cumarsáide, Gníomhaithe ar son na hAeráide & Comhshaoil
Department of Communications, Climate Action & Environment
29-31 Bóthar Adelaide, Baile Átha Cliath, D02 X285
29-31 Adelaide Road, Dublin, D02 X285
——
T +353 (0)1 6782383
Ciaran.Shanley@dccae.gov.ie
www.dccae.gov.ie

"The Department of Communications, Climate Action and the Environment requires customers to provide certain personal data in order to provide services and carry out the functions of the Department. Your personal data may be exchanged with other Government Departments and Agencies in certain circumstances, where lawful. Full details can be found in the Data Privacy Notice, which is available here or in hard copy upon request."

From: Dualta OBroin
To: Ciaran Shanley
Subject: FW: FOI's
Date: 22 August 2018 16:00:53
Attachments: C9DE0E70D68E48BB9062472E5DB53466.png; FB_Opening Statement_JOC CCAE .docx

Sent from my Windows 10 phone

From: Niamh Sweeney
Sent: Tuesday 31 July 2018 11:03
To: Dualta OBroin
Subject: Re: FOI's

Hi Dualta,
Many thanks for this. I will certainly come back to you before next week. In the meantime, I wanted to share our written submission to the Joint Oireachtas Committee on CCAE ahead of tomorrow's meeting (attached). I would be grateful if you could share this with the Minister and his team.
Best,
Niamh

Niamh Sweeney
Head of Public Policy, Ireland
Facebook
4 Grand Canal Square, Dublin 2
m:

From: Dualta OBroin
Date: Monday, July 30, 2018 at 4:00 PM
To: Niamh Sweeney
Subject: FW: FOI's

Hi Niamh
We've received a few FOI requests about the meetings in the last fortnight. When you have a chance, would you mind having a look over the documents contained in the attached folder and let us know if there's anything that should be redacted. I don't believe so as your documents are published – but you might be able to confirm. We're hopeful to get these released early next week if you could come back before then?
Many thanks
Dualta

From: Ciaran Shanley
Sent: 30 July 2018 15:56
To: Dualta OBroin
Subject: FOI's

Dualta,
The documents in the attached zip file relate to two FOI requests that we have on hand. Could you forward them to Niamh and ask if there's anything that should be redacted as commercially sensitive, and if she could get back to us as soon as she can.
Thanks,
——
Ciarán Shanley, Administrative Officer
Broadcasting & Media
——
Roinn Cumarsáide, Gníomhaithe ar son na hAeráide & Comhshaoil
Department of Communications, Climate Action & Environment
29-31 Bóthar Adelaide, Baile Átha Cliath, D02 X285
29-31 Adelaide Road, Dublin, D02 X285
——
T +353 (0)1 6782383
Ciaran.Shanley@dccae.gov.ie
www.dccae.gov.ie

"The Department of Communications, Climate Action and the Environment requires customers to provide certain personal data in order to provide services and carry out the functions of the Department. Your personal data may be exchanged with other Government Departments and Agencies in certain circumstances, where lawful. Full details can be found in the Data Privacy Notice, which is available here or in hard copy upon request."

Disclaimer: This electronic message contains information (and may contain files) which may be privileged or confidential. The information is intended to be for the sole use of the individual(s) or entity named above.
If you are not the intended recipient, be aware that any disclosure, copying, distribution or use of the contents of this information and/or files is prohibited. If you have received this electronic message in error, please notify the sender immediately. This is also to certify that this mail has been scanned for viruses.

I'd like to thank the Committee for asking us to be here today to discuss the issues raised in the recent Dispatches programme that aired on Channel 4 on July 17. My name is Niamh Sweeney and I am the Head of Public Policy for Facebook Ireland. I'm joined here today by Siobhán Cummiskey, who is Facebook's Head of Content Policy for Europe, the Middle East and Africa, and we are both based in our International Headquarters here in Dublin.

We know that many of you who watched the Dispatches programme were upset and concerned by what you saw. Siobhán and I - along with our colleagues here in Dublin and around the world - were also upset by what came to light in the programme, and we fully understand why you wanted to meet with us here today.

The safety and security of our users is a top priority for us at Facebook, and we have created policies, tools and a reporting infrastructure that are designed to protect all of our users, especially those who are most vulnerable to attacks online - children, migrants, ethnic minorities, those at risk from suicide and self-harm and others. So, it was deeply disturbing for all of us who work hard on these issues to watch the footage that was captured on camera at our content review centre here in Dublin, as much of it did not accurately reflect Facebook's policies or values.

As our colleague, Richard Allan, said during an interview with the Dispatches team, we are one of the most heavily scrutinised companies in the world. It is right that we are held to high standards. We also hold ourselves to those high standards. Dispatches identified some areas where we have failed, and Siobhán and I are here today to reiterate our apology for those failings. We should not be in this position, and we want to reassure you that whenever failings are brought to our attention, we are committed to taking them seriously, addressing them in as swift and comprehensive a manner as possible, and making sure that we do better in future.

But first I want to address one of the claims that was made by a CPL staff member on the programme, which was the suggestion that it is in our interests to turn a blind eye to controversial or disturbing content on our platform. This is categorically untrue. Creating a safe environment where people from all over the world can share and connect is core to our business model. If our services are not safe, people won't share content with each other and, over time, would stop using them.
Nor do advertisers want their brands associated with disturbing or problematic content, and advertising is Facebook's main source of revenue. We understand that what I'm saying to you now has been undermined by the comments that were captured on camera by the Dispatches reporter. We are in the process of an internal investigation to understand why some actions taken by CPL were not reflective of our policies and the underlying values on which they are based.

We also wish to address a misconception about reports that relate to an imminent risk of self-harm or suicide. During the programme, a CPL staff member was asked if a backlog of reports could include reports about people who were at risk of suicide. Their answer was that it could, but this was wrong. Suicide-related reports are routed to a different 'queue' so that we can get to them quickly ('queue' is the word we use for the list of reports that we have coming in). Reports about suicide are considered high priority, and almost 100 percent of high priority reports during those two months were reviewed within the set timeframe.

How We Create and Enforce Our Policies

Every day, more than 1.4 billion people from around the world use Facebook. They post in many different languages: everything from photos and status updates to live videos. Deciding what stays up and what comes down involves hard judgment calls on complex issues — from bullying and hate speech to harassment. It's why we developed our Community Standards with input from outside experts — including academics, NGOs and governments from around the world.

We have a Safety Advisory Board comprising leading internet safety organisations on child abuse, domestic violence and internet safety for children and women. Facebook consults with these organizations on issues related to online safety and keeping our community safe. In May of this year, we hosted three "Facebook Forums" in locations across Europe, during which we held discussions with human rights and free speech advocates, as well as counter-terrorism and child safety experts. We will be holding more of these over the course of this year.

Our Community Standards have been publicly available for many years and, in April, for the first time, we published the more detailed internal guidelines used by our review teams to enforce them. We decided to publish these internal guidelines for two reasons. First, the guidelines aim to help people understand where we draw the line on nuanced issues. Second, providing these details makes it easier for everyone, including experts in different fields, to give us feedback so that we can improve the guidelines – and the decisions we make – over time.

For the content review process, we use a combination of artificial intelligence and reports from people to identify posts, pictures or other content that may violate our Community Standards. These reports are reviewed by our Community Operations team, who work 24/7 in over 50 languages. We also work with several companies across the globe, including CPL, the company featured in the Dispatches programme.

This year, we have doubled the number of people working on our safety and security teams to 20,000. This includes over 7,500 content reviewers. We are also investing heavily in new technology to help deal with problematic content on Facebook more effectively.
For example, we use technology to assist in sending reports to reviewers with the right expertise, to cut out duplicate reports, and to help detect and remove known terrorist propaganda and child sexual abuse images before they've even been reported.

That said, Dispatches highlighted a number of issues, and I want to take you through what we have already done and are doing to improve the accuracy of our enforcement.

Actions to Fix Specific Content Errors Highlighted by Dispatches

As I highlighted at the outset, some of the guidance given by trainers to content reviewers during the Dispatches programme was incorrect. As soon as we became aware of these mistakes, we took immediate steps to remove those pieces of content from our platform, in line with our existing policies. These included:

o A decision not to remove a video depicting a three-year-old child being physically assaulted by an adult. This was a mistake because we know that the child and the perpetrator were identified in 2012. The video should have been removed from our platform at that point.
- We removed this video as soon as Dispatches brought this to our attention. Our policies make it clear that videos depicting child abuse are to be removed from Facebook if the child in question has been rescued. In addition to removing this specific piece of content, we are now using media-matching technology to prevent future uploads of the same video to the platform.
- There is a narrow set of circumstances under which we would allow the video to be shared – for example, if the child is still at risk and there is a chance the child and perpetrator could be identified to law enforcement as a result of awareness being raised.
- In the relatively few cases where this kind of video is allowed to remain on our platform, in line with what I have described above, we apply a warning screen for users and limit its distribution to only those who are 18 years of age or older.
- We also send this content to an internal Facebook team known as the Law Enforcement Response team, who can contact local law enforcement.
- According to Malaysian news reports, that is what happened in this case: a neighbour recognised the child in the video, having seen it on Facebook. In that instance, once we know the child has been brought to safety, it is our policy to remove the video and prevent it from being re-uploaded to our platform by using media-matching software. We recognise that there are a number of competing interests at play when it comes to this type of content: namely the child's safety and privacy, the effect on those who may view the content, and the importance of raising awareness of real-world happenings.
- However, on foot of the concerns voiced by safety NGOs and others following the Dispatches programme, we are now actively considering a change to this policy and have started an extensive consultation process with external organisations, including law enforcement agencies and child safety organisations, to seek their views on the exception we currently make for children we believe to be at risk or who could be brought to safety.
- It is important to make absolutely clear that we take a zero-tolerance approach to child sexual abuse imagery. Whether it is detected by our technology or reported to us, we remove it and report it to the US-based National Center for Missing and Exploited Children ("NCMEC": http://www.missingkids.com/home) as soon as we find it.
- NCMEC leads the global, coordinated effort to tackle child sexual abuse imagery, which we, other tech companies and law enforcement agencies around the world - including An Garda Síochána - are a part of. We also use photo- and video-matching technology to prevent this content from being re-uploaded onto Facebook again, and we report attempts to re-upload it to law enforcement agencies where appropriate.

o A decision not to remove a video of teenage girls in the UK who were filmed fighting with each other. This has since been removed from our platform.
- It is our policy to always remove bullying or teenage fight videos, unless they are shared to condemn the behaviour - and even content shared in condemnation appears behind a warning screen that is only visible to people over the age of 18, so that people know the content may be disturbing. The user must click through this warning screen if they want to continue to view the content. In the example highlighted in Dispatches, the person who shared the video did so to condemn this behaviour.
- However, it is our policy to always remove such content, regardless of whether it is shared to condemn it, if the minor or their guardian has requested its removal. When we learned from the Dispatches team that the mother of one of the teenagers involved was deeply upset and wanted this video removed, we immediately deleted it and took steps to prevent it from being uploaded to the platform again.

o A decision not to remove a post comparing Muslims to sponges and a disturbing meme that read, "When your daughter's first crush is a little negro boy". These were both violations of our hate speech policy and should have been removed by the reviewer.
- Hate speech is never acceptable on Facebook and we work hard to keep it off our platform. These posts were left up in error and were quickly removed once we became aware of them via Dispatches. The meme violates our hate speech policy as it mocks a hate crime - i.e. depicting violence motivated by a racial bias. We have deleted it and are using image-matching software to prevent it being uploaded again. The post comparing Muslims to sponges violates our hate speech policy as it is dehumanising.
- We are increasingly using technology to detect hate speech on our platform, which means we are no longer relying on user reports alone. Of the 2.5 million pieces of hate speech we removed from Facebook in Q1 of 2018, 38 percent was flagged by our technology.
- In 2017 the European Commission monitored the compliance of Facebook and other tech companies as part of the Code of Conduct on countering illegal hate speech online, and we received the highest score, removing 79 percent of potential hate speech – 89 percent of which was removed within 24 hours. [http://ec.europa.eu/newsroom/just/item-detail.cfm?item_id=612086]

Changes to Processes and Policies

As mentioned previously, when writing our policies we seek input from experts and organisations outside Facebook so we can better understand different perspectives on safety and expression, as well as the impact of our policies on different communities globally.

o Flagging suspected under-13s' accounts
- We do not allow people under 13 to have a Facebook account. If someone is reported to us as being under 13, the reviewer will look at the content on their profile (text and photos) to try to ascertain their age.
If they believe the person is under 13, the account will be put on hold and the person will not be able to use Facebook until they provide proof of their age.
- Since the programme, we have been working to update the guidance for reviewers to put a hold on any account they encounter if they have a strong indication it is underage, even if the report they are reviewing was for something else.

o Removing child abuse videos
- As flagged above, our policy with respect to (non-sexual) child abuse videos is under review. We have started a consultation process with external organisations, like child safety organisations and law enforcement, to decide if it is appropriate to continue with our policy of allowing these videos on our platform in the limited circumstances described above (i.e. when they are shared to condemn the behaviour and the child is still at risk).

Actions to Address Training and Enforcement of our Content Policies

We recognise the responsibility we have to get this right. Content review at this scale has never been done before, as there has never been a platform where so many people communicate in as many languages across so many countries and cultures. We work with reputable partners to deliver content moderation services because it enables us to respond more quickly to changing business needs. For example, we may need to quickly increase the number of staff we have in different regions, and the outsourcing model enables us to do that.

The training and performance support process in place for all reviewers - including full-time employees, contractors and partner companies - involves three key stages:

o Pre-training: which includes what to expect on the job. New team members also learn how to access resiliency and wellness resources, and information on how to connect with a psychologist when they need additional support.
o Hands-on learning: this involves a minimum of 80 hours' training with an instructor, followed by an extended period of job simulation so that new staff can practice safely in a "real" environment before moving on to deal with real reports. Following hands-on practice, reviewers get a report highlighting areas where they need to improve their accuracy.
o Ongoing coaching and refresher training: all reviewers receive regular 1:1 coaching, mentoring and training on an ongoing basis and as policies are updated.

However, in light of the failings highlighted by Dispatches, we are making changes to substantially increase the level of oversight of our training by in-house Facebook policy experts, and to test even further the readiness of our content reviewers before they start reviewing real reports.

Improvements to Training

o We are carrying out an internal investigation with CPL to establish how these gaps between our policies and values and the training given by CPL staff came about.
- This is being led by Facebook, rather than by CPL, due to the extremely high priority we attach to this. This investigation began in earnest on Monday, July 23 as, out of an abundance of caution and concern for their wellbeing, CPL encouraged the staff members directly affected by the programme to take some time off.

o We immediately carried out retraining for all trainers at our CPL centre in Dublin.
- We initiated re-training for all CPL trainers as soon as we became aware of discrepancies between our policies and the guidance that was being given by CPL trainers to new staff.
- Ongoing training will now continue with twice-weekly sessions to be delivered by content policy experts from Siobhán Cummiskey's team.
- CPL is also now directly involved in weekly deep-dive discussions with Siobhán's team on our policies covering issues like hate speech and bullying.
- All content reviewers will continue to receive regular coaching sessions and updated training on our policies as they evolve.

o We revised the training materials used to train content reviewers to ensure they accurately reflect our policies and illustrate the correct actions that should be taken in all circumstances.
- This has been done for CPL and for all of our content review centres globally.
- These materials have been drafted and approved by Facebook only and will continue to be updated by us as our content policies evolve.

o We are seconding highly-experienced subject matter experts from Facebook to CPL's office for a minimum of six months to oversee all training and provide coaching, reinforcement and mentoring.

o We are introducing new quality control measures, including new dedicated quality control staff to be permanently assigned to each of our content review centres globally.

o We are also conducting an audit of past quality control checks at CPL going back for a period of six months to identify any repeat failings that may have been missed.
- This will include temporarily removing content reviewers who have made consistent or repeated errors from this type of work until they have been retrained.
- We will also continue to deploy spot testing at our review centres. If we find any irregularities around the application of certain policies more broadly, we will test for accuracy using targeted spot checking of all content reviewers to improve accuracy.

o Finally, we have been in the process of enhancing our entire onboarding curriculum over the past few months and are continuing to do so. The enhancements to our curriculum include even more practice, more coaching and more personalization to help content reviewers focus on areas where they may benefit from additional upskilling.

Taking Care of the Reviewers

Content review work is not easy and sometimes means looking at disturbing material — and making decisions about what action to take, mindful of both the cultural context and the Community Standards that establish our policies.

At Facebook we have a team of four clinical psychologists across three regions who are tasked with designing, delivering and evaluating resiliency programs for everyone who works with graphic and objectionable content. This group also works with our partners, who have their own dedicated resiliency teams, to help build industry standards. All content reviewers — whether full-time employees, contractors, or partner companies — have access to mental health resources, including trained professionals onsite for both individual and group counseling. There are nine full-time psychologists who work at our CPL office in Dublin. All reviewers have comprehensive health insurance.

We also pay attention to the physical environment where our reviewers work. Because these teams deal with such serious issues, the environment they work in and the support around them are important to their well-being.

Digital Safety Commissioner Bill 2017

Before finishing, I would like to share our thoughts on the Law Reform Commission's (LRC) 2016 proposal to create a Digital Safety Commissioner with statutory take-down powers.
As you are all no doubt aware, the LRC's proposal also provided a foundation for Deputy Donnchadh Ó Laoghaire's Private Member's Bill, the 'Digital Safety Commissioner Bill 2017'.

We understand the motivation behind the establishment of a Digital Safety Commissioner and have discussed it with many of our safety partners here in Ireland. We also understand the appeal of having an independent, statutory body that is authorised to adjudicate in cases where there is disagreement between a platform and an affected user about what constitutes a "harmful" communication, or to provide a path to appeal for an affected user where we have, in error, failed to uphold our own policies.

We would also acknowledge the draft bill's efforts to ensure that its scope is not overly broad, in that an appeal to the Digital Safety Commissioner can only be made by an individual whom the specified communication concerns. And we very much see the benefit in a single office having the ability to oversee and coordinate efforts to promote digital safety throughout communities, much of which has been captured in the Government's recently published Action Plan for Online Safety. It is only through a multi-pronged approach, of which education is a critical part, that we can begin to see positive changes in how people engage and protect themselves online.

In addressing the nature of "harmful communications", the LRC's report says the following:

"While there is no single agreed definition of bullying or of cyberbullying, the well-accepted definitions include the most serious form of harmful communications, such as... so-called "revenge porn"; intimidating and threatening messages, whether directed at private persons or public figures; harassment; stalking; and non-consensual taking and communication of intimate images..."

We agree with the LRC with respect to all of these types of communication. The sharing of non-consensual intimate images (NCSII), harassment, stalking and threatening messages are all egregious forms of harmful communication and are banned both by our Community Standards and, in some cases, by law. And we fully support the LRC's proposals to create new criminal offences to tackle NCSII and online harassment, where those offences are clearly defined and practicable for a digital environment. We have also stepped up how we tackle NCSII that is shared on our own platform; more information is available here: https://newsroom.fb.com/news/2017/04/using-technology-to-protect-intimate-images-and-help-build-a-safe-community/

However, the proposed bill is unclear as to what, precisely, constitutes a "harmful communication". No definition is included in the draft legislation, but from the drafting of the bill it appears that this concept is intended to be broader than content that is clearly criminal in nature, much of which I have outlined above. The exact parameters are left undefined, however, and this will lead to uncertainty and unpredictability.

As the LRC wrote in 2016, the internet "enables individuals to contribute to, and shape debates on, important political and social issues, and, within states with repressive regimes, the internet can be a particularly valuable means of allowing people to have their voices heard.
Freedom of expression is therefore the lifeblood of the internet and needs to be protected."

The report continues:

"...balancing the right to freedom of expression and the right to privacy is a challenging task, particularly in the digital and online context. Proposing heavy-handed law-based measures intended to provide a remedy for victims of harmful digital communications has the potential to interfere with freedom of expression unjustifiably, and impact on the open and democratic nature of information sharing online which is the internet's greatest strength."

We agree with the LRC's analysis here. And while it would clearly not be the intention of this bill to impact on free speech in Ireland, the Commissioner's ability to order the removal of "harmful communications" without allowing the digital service undertaking an opportunity to appeal ought to be considered in light of its potential to limit freedom of expression. Therefore, we believe it is important that a clear definition of what constitutes a harmful communication be included in the legislation. Further, because the right to free expression is expressly protected in our Constitution, and because that right can only be limited in narrow circumstances, it is vital that any provisions limiting or impacting upon speech be clearly and strictly defined.

In this vein, I would also like to note the extraterritorial jurisdiction envisaged in Section 10 of the Draft Bill. While it may not have been intended, this could lead to a scenario in which an Irish court issues an order in respect of a communication made outside of Ireland, on a digital platform based outside of Ireland, and in respect of an Irish citizen who is also outside of Ireland. There may well be challenges as to how this would work in practice and as to the impact it may have.

However, as I have said previously, Facebook has put Community Standards in place for a reason: we want our community to feel safe and secure when they use our platform. We are committed to the removal of content that breaches those Standards, and we are keen to continue to engage with this Committee and others as the 'Digital Safety Commissioner Bill 2017' moves into the next phase of the legislative process.

Again, I would like to thank the Committee for meeting with us today to discuss these important issues. Thank you.

From: Dualta OBroin
To: Ciaran Shanley
Subject: FW: FOI's
Date: 22 August 2018 16:00:06

Sent from my Windows 10 phone

From: Dualta OBroin
Sent: Tuesday 7 August 2018 17:19
To: 'Niamh Sweeney'
Subject: RE: FOI's

Hi Niamh
Just following up on the below – we're hoping to get these requests out this week if at all possible.
Le dea-mhéin
Dualta

From: Dualta OBroin
Sent: 30 July 2018 15:59
To: 'Niamh Sweeney'
Subject: FW: FOI's

Hi Niamh
We've received a few FOI requests about the meetings in the last fortnight. When you have a chance, would you mind having a look over the documents in the attached folder and letting us know if there's anything that should be redacted? I don't believe there is, as your documents are published, but you might be able to confirm. We're hoping to release these early next week, so it would be great if you could come back to us before then.
Many thanks
Dualta

From: Ciaran Shanley
Sent: 30 July 2018 15:56
To: Dualta OBroin
Subject: FOI's

Dualta,
The documents in the attached zip file relate to two FOI requests that we have on hand.
Could you forward them to Niamh and ask if there's anything that should be redacted as commercially sensitive, and if she could get back to us as soon as she can?
Thanks,

——
Ciarán Shanley, Administrative Officer
Broadcasting & Media
——
Roinn Cumarsáide, Gníomhaithe ar son na hAeráide & Comhshaoil
Department of Communications, Climate Action & Environment
29-31 Bóthar Adelaide, Baile Átha Cliath, D02 X285
29-31 Adelaide Road, Dublin, D02 X285
——
T +353 (0)1 6782383
Ciaran.Shanley@dccae.gov.ie
www.dccae.gov.ie

"The Department of Communications, Climate Action and the Environment requires customers to provide certain personal data in order to provide services and carry out the functions of the Department. Your personal data may be exchanged with other Government Departments and Agencies in certain circumstances, where lawful. Full details can be found in the Data Privacy Notice, which is available here or in hard copy upon request."