September 12, 2019 - Announcing the Winners of our Nationwide Tech Student Writing Competition!

The Institute for Technology Law & Policy is pleased to announce the winners of the inaugural nationwide Georgetown Law Technology Review Student Tech Writing Competition.

The competition, conducted with generous support from BSA | The Software Alliance, challenged students to address legal, political, and social questions raised by the emergence of artificial intelligence, machine learning, and algorithmic decision-making. 

A panel of judges—comprising representatives from academia, civil society, industry, and government—selected three winners from the more than fifty submissions to the competition. 

The winners are: 

First Place: Lauren Renaud (Georgetown '19), Will You Believe It When You See It: How and Why the Press Should Prepare for Deepfakes

Second Place: Thomas Belcastro (Cornell '19), Getting On Board with Robots: How the Business Judgment Rule Should Apply to Artificial Intelligence Devices Serving as Members of a Corporate Board

Third Place: Theodore Bruckbauer (Northwestern '19), CFIUS and A.I.: Defending National Security While Allowing Foreign Investment

Each winning paper is awarded a cash prize: $4,000 for first place, $2,000 for second place, and $1,000 for third place. In addition to cash prizes, each of the selected papers will be published as student notes in the Georgetown Law Technology Review.

Congratulations to our winning authors, and thank you to everyone who submitted excellent papers for this competition!

Thank you, also, to our judges for the 2019 Writing Competition: 

Joshua Banker, Editor-in-Chief, Georgetown Law Technology Review

Professor Julie Cohen, Georgetown Law

Aaron Cooper, VP for Global Policy, BSA | The Software Alliance

Alexandra Givens, Executive Director, Georgetown Institute for Tech Law & Policy

Laura Hillsman, co-Managing Editor, Georgetown Law Technology Review

Logan Koepke, Senior Policy Analyst, Upturn

Dr. Carolyn Nguyen, Director of Technology Policy, Microsoft

Dean Paul Ohm, Georgetown Law

Anant Raut, Senior Counsel, U.S. Senate Committee on the Judiciary

Professor Tanina Rostain, Georgetown Law

Bari Williams, VP of Legal, Business & Policy Affairs, All Turtles


The call for the 2019-2020 academic year competition will be posted within the next week at www.georgetowntech.org/writingcompetition.


September 11, 2019 - Georgetown Law Hosts Inaugural We Robot DC Salon

On Wednesday, the Institute hosted the first-ever We Robot DC Salon, a policy-focused evening event featuring leading papers from the We Robot conference, the premier conference focused on the interactions of humans and machines.

The evening drew together academics, policymakers, and experts from industry and civil society to engage with leading interdisciplinary research on the cutting edge of robot design, development, and deployment.

  • Stanford Professor Abby Everett Jacques discussed her paper, Why the Moral Machine is a Monster, addressing the challenges created by taking a limited view of ethics when conceptualizing and designing robotic systems. Georgetown’s Julie Cohen provided insight and led discussion. 

  • Cornell Professors Karen Levy and Solon Barocas presented their work, Reap What You Sow? Precision Agriculture & The Privacy of Farm Data, covering the system-wide influence of big data collection and use in farming communities. The conversation was led by Kristen Thomason from the University of Windsor. 

  • To finish the evening, Stephanie Ballard and We Robot organizer Ryan Calo presented their paper, Taking Futures Seriously: Forecasting as a Method in Robotics Law & Policy, examining how to anticipate potential futures when designing policy and law for robotic systems. Dr. Anne Washington of NYU led the conversation. 

The evening took the opportunity to honor Ian Kerr, a leading scholar in robotics law and one of the co-founders of We Robot, who passed away in August. The Canada Research Chair in Ethics, Law and Technology at the University of Ottawa, Ian was among the earliest and most influential academics to explore the social and ethical implications of the interactions between humans and machines. You can read more about him and his work in this wonderful tribute by We Robot co-founder Michael Froomkin, and in another remembrance by Ian’s colleague Michael Geist. The Tech Institute announced a donation to the Ian Kerr Memorial Fund to support student scholarships in his honor.

More information on the We Robot DC Salon is available here.

Information on the full conference, including registration details for the April 2020 conference in Ottawa, can be found here.

Many thanks to Microsoft for their sponsorship to make this evening possible.

August 23, 2019 - Institute Associate Jeff Gary Publishes New Paper on Platform Moderation for Knight First Amendment Institute

In a new paper, Institute Associate Jeff Gary and former FTC Chief Technologist Ashkan Soltani argue that addressing platform moderation concerns must begin with an examination of the online advertising models and financial incentives of platforms. 

The article, published as part of the Knight First Amendment Institute’s Free Speech Futures series, argues that platforms do not have proper incentives to identify and remove controversial content because “the business models of the platforms themselves encourage and reward divisive or controversial content” since that content drives the most engagement, and therefore provides the platforms with the greatest potential revenue. 

Instead, any meaningful solution to content concerns “must address the ways in which platforms collect data and curate content in order to maximally harvest users’ attention and sell advertisements, the main method of monetization online.” 

This approach comes as policymakers and academics grapple with the ongoing challenges of widespread harmful content online and marks a departure from many current legislative and regulatory proposals, many of which focus on the direct control of particular types of content or modes of speech. 

The full paper is available on the Knight First Amendment Institute website.


August 1, 2019 - Executive Director Alexandra Givens Speaks at National Employment Conference on Algorithmic Fairness & the Rights of People with Disabilities

Our Executive Director Alexandra Givens spoke today about her work on algorithmic fairness and the rights of people with disabilities at the National ILG Conference, the largest convening for federal contractors focused on affirmative action, equal employment opportunity, diversity and inclusion.

Givens focused on the use of AI-driven tools that are currently being marketed for sourcing job candidates, evaluating applicants, and evaluating employee performance. Drawing on case studies, she highlighted the particular risks such tools may pose for excluding individuals with disabilities. Products purporting to conduct “sentiment analysis” of candidates’ personalities based on recorded video interviews, for example, may significantly disadvantage individuals whose facial presentation differs from perceived norms. Products that ask employers to create “success profiles” for effective job candidates may replicate selection bias in those profiles, such as the absence of people with disabilities.

Givens cautioned that specific challenges make it difficult to detect and address these biases for people with disabilities. For example, people with disabilities often don’t disclose their disability status to a future employer, making it hard to track whether they are being disproportionally screened out. In addition, the wide variety of disabilities makes it virtually impossible to create fully representative training data that can train algorithms in a more inclusive way.

The Institute recently announced an extensive new project to explore algorithmic fairness and the rights of people with disabilities. Among other things, the project is assessing how rights guaranteed under the Americans with Disabilities Act, the Rehabilitation Act’s Sections 501, 503 and 504, and other statutes protect employees in this context — and the obligations on employers to validate whether such tools may have discriminatory effects. To learn more about the project or join its mailing list, click here.

July 31, 2019 - Faculty Director Julie Cohen Cited in New Senate Bill to Protect Voter Information

Work by our Faculty Director Julie Cohen was cited today in the findings for new proposed federal legislation to protect voter privacy.

The Voter Privacy Act, S. 2398, introduced by U.S. Senator Dianne Feinstein, seeks to regulate the information about voters collected and used by political campaigns in federal elections. It would allow voters to review personal information collected by political organizations, require organizations to notify voters when they receive personal information, and permit voters to require deletion or prevent the transfer or sale of their data. Voters would be able to instruct websites like Google and Facebook to withhold their data profiles from political organizations to avoid targeted advertising. The Federal Election Commission would oversee enforcement of the Act.

The “findings” section of the bill lays out the case for why such measures are needed, providing important context that would likely be used if the enacted bill were ever challenged in court. The findings include direct quotes from Professor Cohen’s work on how digital communications flows can be used to manipulate individual behavior and emotion:

“A forthcoming publication by Julie E. Cohen titled ‘Between Truth and Power’ describes the phenomenon as follows: ‘The operation of the digital information environment has begun to mimic the operation of the collection of brain structures that mid-twentieth-century neurologists christened the limbic system and that play vital roles in a number of precognitive functions, including emotion, motivation, and habit-formation,’ and observed that ‘today’s networked information flows are optimized to produce what social psychologist Shoshana Zuboff calls instrumentarian power: They employ a radical behaviorist approach to human psychology to mobilize and reinforce patterns of motivation, cognition, and behavior that operate on automatic, near-instinctual levels and that may be manipulated instrumentally.’”

The Voter Privacy Act is one of several bills Senator Feinstein has introduced in an effort to respond to Russian interference in the 2016 election. Other bills include the Prevention of Foreign Interference with Elections Act, the Bots Disclosure and Accountability Act and the Foreign Agents Disclosure and Registration Enhancement Act.

It has been referred to the Senate Rules Committee.

July 26, 2019 - Distinguished Fellow Gigi Sohn Speaks to NPR About New Maine Law to Protect Broadband Privacy, DOJ Approval of Sprint-T-Mobile Deal

The Institute’s Distinguished Fellow Gigi Sohn spoke with NPR twice this week to discuss two developing issues in tech and communications policy.

New Broadband Privacy Law in Maine

In a July 27 interview, Sohn spoke about new legislation passed in Maine to restore internet privacy protections following Congress’s repeal of the Obama-era broadband privacy rules. The act, which passed the Maine legislature with strong bipartisan support in May, requires broadband internet service providers (ISPs) to ask customers for permission before using, selling, or sharing customers’ personal information. The bill defines such information to include an individual’s browsing history, application usage, geolocation, the content of communications, device identifiers and the origin and destination internet protocol addresses.

Maine’s new law requires ISPs to give customers a “clear, conspicuous and nondeceptive notice” of the customer’s rights, and prohibits ISPs from refusing service to customers who withhold consent. It also prohibits them from offering financial or other incentives for customers to opt-in.

ISPs may use or share information that does not fall within the definition of customer personal information unless an individual opts out.

The Act regulates approximately 80 broadband internet service providers that operate in Maine, covering customers “that are physically located and billed for service received in the State.” The law takes effect in July 2020.

The new law reflects growing momentum to adopt consumer privacy protections at the state level in the absence of comprehensive federal privacy protections. With its opt-in regime, Maine’s new law has been widely viewed as the strongest ISP consumer privacy measure in the country.

On Saturday, Sohn spoke with NPR to explain the impact of the bill, which she testified in support of in April of this year. Listen to her interview here.

Sprint-T-Mobile Merger Approved

The prior day, Sohn spoke with NPR about the Justice Department’s approval of the proposed merger between Sprint and T-Mobile, a $26 billion deal that has been the subject of protracted regulatory review.

To gain approval, the companies agreed to sell certain assets to Dish Network, including wireless spectrum and Boost Mobile, Sprint's prepaid phone business. The deal includes requirements that the merged company make available to Dish 20,000 cell sites and hundreds of retail locations. The concessions would effectively make Dish the country's fourth-largest wireless carrier, although it would continue to rely on T-Mobile for use of its network. The merged company must give Dish access to the T-Mobile network for seven years following the deal, while Dish builds its own 5G network.

Along with other consumer advocates, Sohn has expressed continued concern with the deal. "The concessions are not enough to make sure there's a strong, viable, fourth national competitor," she told NPR.

In a statement, she elaborated:

“Given incontrovertible evidence of higher prices and reduced competition, Assistant Attorney General Makan Delrahim should have blocked this merger. . . The state AGs who sued to block the merger shouldn’t be fooled by this weak attempt to maintain competition in the mobile wireless market. . . A new mobile wireless entrant that starts with zero postpaid subscribers and that must rely on its much bigger rival, the new T-Mobile, just to operate is not a competitor. It’s a mobile Frankenstein.”

Earlier this year, the Institute hosted a debate on Capitol Hill in which representatives from Sprint and T-Mobile faced off against opponents of the deal.

Sohn later testified against the deal before the House Judiciary Committee’s Subcommittee on Antitrust, arguing that the merger would negatively impact consumers and competition.

Attorneys general of 14 states and the District of Columbia have sued to stop the deal, arguing it will increase consumer wireless costs by at least $4.5 billion a year.

July 9, 2019 - Institute Faculty Director Angela Campbell Testifies Before the Senate Judiciary Committee at Hearing on “Protecting Innocence in a Digital World”

On Tuesday, our Institute Faculty Director Angela Campbell testified before the Senate Judiciary Committee at a hearing on “Protecting Innocence in a Digital World.”

Professor Campbell spoke in support of updating the Children’s Online Privacy Protection Act (COPPA), passing new legislation relevant to current online practices, passing the Do Not Track Kids Act of 2019, and fully utilizing the Federal Trade Commission’s (FTC) existing authority to enforce COPPA.

As a professor and clinical director at Georgetown Law, Professor Campbell has spent thirty years working to improve children’s media environment, representing nonprofit organizations such as the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy. The hearing focused on harms affecting children online, and the role of the federal government in addressing such concerns.

Highlighting the roots of risks plaguing children online, Professor Campbell emphasized two issues: 

“First, the business models of the dominant tech companies [are] designed, not to protect children or nurture children, but to attract large numbers of users, including children, and keep them online as long as possible. . .

Second, the government has failed to adopt sufficient safeguards for children and has not effectively enforced the safeguards that do exist . . . COPPA has not kept up with [new technological developments]."

Professor Campbell noted that, "Passing the Do Not Track Kids Act of 2019, which was introduced by Senators Markey and Hawley, would be a good first step” to updating online protections. She also called for better enforcement of existing laws.

“Even in the absence of new legislation, the FTC can and should act to better protect children's privacy. The FTC should use its authority under Section 5 to hold platforms responsible when they characterize apps or other content as appropriate for children when they are not and, conversely, when [apps] say their services are not appropriate for children and yet they know there are a lot of children using them anyhow."

Additionally, Congress should consider emerging issues around “facial recognition, artificial intelligence, the internet of things, and other new technologies.”

Under Professor Campbell’s guidance, Georgetown’s IPR Tech and Communications Clinic has filed complaints with the FTC flagging online practices that violate COPPA. In December, the clinic filed a complaint with the FTC requesting an investigation into whether Google's marketing of apps directed to children in the Google Play Store violates the FTC Act’s prohibition on deceptive and unfair practices.

You can watch Professor Campbell’s full testimony here and read her written testimony here.

Read more about the IPR Tech and Communications clinic here.

Prof. Angela Campbell testifies before the Senate Judiciary Committee on children’s privacy, July 2019

June 28, 2019 - Institute hosts event on Wiki AI: looking under the hood of applying ethics to machine learning

On Friday, the Tech Institute, along with the Center for Democracy & Technology, welcomed the Wikimedia Foundation to host “Wiki AI: Looking Under the Hood of Applying Ethics to Machine Learning.”

The workshop provided civil society and congressional staff an in-depth, open forum to discuss new algorithmic tools developed by Wikimedia to assist with governance and moderation on its web platforms. The tools help editors flag troubling content, automatically translate existing Wiki pages, assist humans with edits, and automate other functions. The dialogue with Wikimedia’s technical, research, and policy staff furthered the ongoing conversation about best practices for developing and implementing new AI tools and the potential of automated processes to reinforce existing bias, and it offered new insight into what policymakers can learn from Wikimedia’s experience to inform the broader discussion around the responsible development and deployment of artificial intelligence.

Representatives from Wikimedia provided insight into the values of the Foundation and how engineers strive to develop new tools in line with Wikimedia’s commitment to openness and transparency. Key topics included how Wikimedia balances the accuracy of its content against concerns of entrenching bias through automated editing, ensuring that Wikimedia remains a human-centered platform while also enabling new workflows, and deploying tools in a way that respects the existing community while also making editing and moderation a more inclusive process. 

Throughout the morning, panelists especially highlighted four major themes:

  • Research in the Service of Free Knowledge: Wikimedia increasingly serves as a global knowledge repository. The Foundation is researching its knowledge gaps and the integrity of its knowledge so it can effectively develop technology that supports the world’s free knowledge infrastructure.

  • Developing Ethical AI: How can Wikimedia’s developers incorporate ethical guidelines into the nuts-and-bolts practice of product design?

  • Transparency Isn’t Enough: Wikimedia’s machine learning tools must balance quality control and efficiency with fairness. How the Wikimedia community of contributors interacts with these tools illuminates the power of auditing machine learning tools and moving from transparency to empowerment and participation. 

  • Treasure In; Treasure Out: Wikimedia’s intentional approach to design and user data takes lessons from human-to-human processes of building trust to strive for immersive feedback and equitable value exchange between users and AI systems.

Thanks to our partners at the Wikimedia Foundation and the Center for Democracy & Technology as well as Georgetown Law’s Amanda Levendowski for collaborating with us on this event. More information about Wikimedia’s use of AI on its platform is available at https://www.mediawiki.org/wiki/Wikimedia_Product/Perspectives/Augmentation

June 12, 2019 - Institute Fellow Deloris Wilson Selected as 2019 Atlantic Fellow for Racial Equity

Congratulations to our fellow Deloris Wilson, who was just selected as a 2019 Atlantic Fellow for Racial Equity!

The Atlantic Fellows program brings together leaders on the front lines of the struggle for racial equity in South Africa and the U.S. to explore, imagine, experiment, and build long-term solutions for impactful change. The inaugural cohort of 19 fellows includes authors, community organizers, documentary filmmakers, faith leaders, scientists and other leaders who work to challenge anti-Black racism and build the policies, institutions and narratives needed for a more equitable future.

For two years, Deloris has led the Tech Institute’s work building BEACON: The D.C. Women Founders’ Initiative, a community-led initiative to make D.C. a leading hub for diverse women entrepreneurs. In partnership with the D.C. Mayor’s Office, Google, and a broad coalition of community leaders, BEACON studies gaps in the ecosystem supporting women entrepreneurs and works to fill them by mobilizing businesses, government and the community.

Among other projects, BEACON runs an annual grant program that funds innovative projects to support D.C.’s women entrepreneurs. It operates D.C.’s largest directory of women-owned businesses and resources for women entrepreneurs. BEACON hosts or co-hosts regular programming, leads campaigns to address issues affecting women entrepreneurs, identifies speaking and vending opportunities for women business owners, and produces a biweekly newsletter showcasing resources and opportunities. BEACON is particularly focused on elevating the voices of women entrepreneurs of color, who are historically underfunded and under-supported despite being the largest group of women-owned business owners in the District.

From 2016 to 2019, BEACON was incubated at the Tech Institute, where we conducted strategic and operational work to build the network and led academic research on ways to improve D.C.’s ecosystem for diverse women founders. In that time, BEACON grew to serve over 3,000 entrepreneurs, funded 30 community organizations, and shared its findings locally and nationally. Our academic work culminated in Deloris’s 2018 report, “Building Inclusive Ecosystems with Intentionality: A Strategy to Enhance Support for D.C.’s Women Founders.”

Despite BEACON’s strong roots in D.C., our research into building effective support systems for diverse entrepreneurship has wide application. We’ve presented our work before the National Women’s Business Council, the Senate Committee on Small Business and Entrepreneurship, the United Nations Foundation, the U.S. Chamber of Commerce Foundation, and other national organizations. The Tech Institute’s work on BEACON was generously supported by a philanthropic gift from Google.

After two and a half years of incubation, we’re proud that BEACON has now grown into an independent 501(c)(3) organization, led by D.C.-based entrepreneurs and community leaders and supported by an Advisory Board of leading entrepreneurs, VCs, service providers, policy experts and more. Deloris has become BEACON’s Chief Strategy Officer. Our Executive Director, Alexandra Givens, remains deeply involved on BEACON’s board. You can read more about BEACON’s ongoing work here.

We are deeply proud of our work building BEACON, and doubly so that Deloris will continue her passionate leadership on racial justice issues both with BEACON and through her Atlantic Fellowship. Congratulations Deloris!

June 6, 2019 - Announcing the Iron Tech Lawyer Invitational!

Are you a student or professor interested in creating legal tech or data science solutions to help the public interest? Read about the new Iron Tech Lawyer Invitational, which we’re proudly announcing today!


Georgetown Law’s Institute for Technology Law & Policy and the Justice Lab at Georgetown Law are pleased to announce the inaugural Iron Tech Lawyer Invitational, a national competition for student-created legal tech solutions that help bridge the justice gap.

Student teams from qualifying universities are invited to a one-day pitch competition in Washington, D.C. to showcase a legal tech or data analysis tool they have developed for a pro bono organizational client.

The students must complete the work in an academic course, clinic, or supervised independent study during the 2019-2020 academic year. Client organizations can include legal services organizations or other non-profits focused on assisting people with civil legal problems.

Qualifying universities are invited to submit one student project for the Invitational. Student teams will travel to Washington, D.C. during the week of April 12, 2020 to present their projects at the Iron Tech Lawyer Invitational, hosted at Georgetown Law.


PRIZE

Projects will be evaluated by a panel of experts in access to civil justice, legal design and technology. The winning team will be awarded $5,000 in funding support to advance or complete their technology or data science solution.


ELIGIBILITY

The Iron Tech Lawyer Invitational is designed to encourage the creation of academic courses focused on the thoughtful development of technology and data-driven solutions to help improve the civil justice system. Student teams must be supported by a professor, and complete their project in an academic course, clinic or independent study.

Professors who are interested in sending a student team to the Invitational must meet the following criteria:

  • The students must complete a student project in an academic program, i.e. in an academic course, clinic, or supervised independent study.

  • The student project must involve the creation of a technology tool or data project that strengthens legal service delivery or otherwise improves access to the civil legal system.

  • The student project must be developed for a “client” that is a non-profit legal services provider or other non-profit that assists people with their civil legal problems.

  • The student project must be supported by a faculty sponsor, such as the teacher of the course or supervisor of an independent study.

  • The student project must be completed during the 2019-2020 academic year.

  • Only one student project may be submitted per university.


APPLICATION PROCESS

(1) Faculty Interest Form. Professors who are interested in sending a student team to the Iron Tech Invitational must complete a School Interest Form by July 15, 2019. (This form may not be submitted by students). The Interest Form is intended as a high-level expression of interest; you do not yet need to select which students you will send, and client organizations and specific projects need not be identified.

(2) Follow-Up. Professors who have submitted a School Interest Form will be contacted by the organizers to discuss the competition, shared pedagogical goals, and eligibility.

(3) Faculty Application. Professors must submit a School Application to secure a slot for one student project from their university. At this stage, client organizations and specific student projects must be identified. If the professor is supervising multiple student projects, they need not have selected which student project they will send to the Invitational. We expect that many professors will run their own mini-Iron Tech Competition within their class, program, or between courses offered within their university to select which student team proceeds to the Invitational.

(4) Selection of Qualifying Schools. The Invitational’s organizers will select 5-8 qualifying universities, each of which will send a student project of the university’s choosing to the Invitational.

(5) Submission of Final Student Projects. Qualifying universities will notify the Invitational’s organizers of the student project they have chosen to represent them, and submit a link to the final project and supporting documentation.


COMPETITION

Projects will be judged by a panel of experts in access to justice, legal design and technology. Award criteria will include:

  • Usefulness

  • Completeness

  • Ambition & Creativity

  • Design

  • Student/Team Presentation


FINE PRINT

Professors are responsible for identifying and securing client organization(s) for the student projects, and are solely responsible for the relationship with any client organization. Schools must provide the necessary software to develop the application.

The organizers make no representation as to the accuracy, or suitability for use, of student projects submitted to the Iron Tech Invitational. Projects are not the work product of Georgetown Law or the organizers. The organizers reserve the right to amend the application or competition rules, and will provide notice to applicants of any changes.



QUESTIONS?

See our FAQs, or email TechInstitute@law.georgetown.edu


PRESENTED BY:

The Iron Tech Invitational is made possible by generous support from the Bigglesworth Family Foundation.


June 3, 2019 - Georgetown's Tech & Communications Clinic Files FTC Comment on Protecting Children's Privacy

Students in Georgetown’s IPR Tech & Communications Clinic today filed comments with the FTC as part of the agency’s hearing on Competition and Consumer Protection in the 21st Century: The FTC’s Approach to Consumer Privacy.

The comments, filed on behalf of clinic clients the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, focused on protecting the privacy of children and teens:

“While the Children’s Online Privacy Protection Act (COPPA) is intended to protect the privacy of children under age 13, it is no longer up to the task.

“COPPA’s underlying assumption is that parents will be able to protect their children’s privacy if companies give notice of their privacy practices and do not collect personal information unless the parent gives consent. But this no longer works. Most parents do not read privacy policies, and even if they do, many do not provide the information needed for informed consent.

“Given the unprecedented amount of data being collected, the sophistication of data mining techniques, and the lack of transparency, most people lack a sufficient understanding of the scope of the data collected and how it could be used. Moreover, because the FTC has not effectively enforced COPPA, many companies feel free to ignore COPPA’s requirements.”

The clinic’s comment elaborates on the many ways in which new technologies have made it harder to protect children’s privacy in the 20 years since the Children’s Online Privacy Protection Act (COPPA) was enacted. Digital technologies have become a pervasive presence in the life of children and teens, and massive-scale data-mining and other techniques have made “informed consent” an unworkable regime to protect users’ privacy.

The comment notes shortcomings in the FTC’s current enforcement of COPPA, arguing that the FTC rarely brings enforcement actions under COPPA even though many companies fail to comply with COPPA requirements. It urges the FTC to make public the information provided by “safe harbor organizations”, which exist to certify compliance with COPPA but often do not rigorously enforce their guidelines.

The comment argues that new legislation is also needed:

“New legislation, such as the bipartisan Markey-Hawley bill, is needed to address COPPA’s shortcomings.

“Any legislation must include developmentally appropriate protections for teens, because COPPA only covers children under age 13. The legislation should also prohibit practices that may be harmful to children, rather than requiring parents to read and try to understand the impact of multiple privacy policies.

“Until such legislation is passed, however, the FTC can and should do more to better protect children’s privacy. Specifically, we urge the FTC to undertake more enforcement actions, to enforce COPPA’s notice requirements, and to fix problems with the COPPA safe harbor program.”

The IPR Tech & Communications Clinic has for decades been one of the most active watchdogs tracking enforcement of children’s privacy laws.

In December 2018, the clinic filed a complaint with the FTC requesting an investigation into whether Google's marketing of apps directed to children in the Google Play Store violates the FTC Act’s prohibition on deceptive and unfair practices.

In October 2018, the clinic partnered with the Institute in hosting the conference “COPPA at 20: Protecting Children’s Privacy in the Digital Age.”

The FTC’s hearing is part of an extensive series of hearings the FTC is convening on Competition and Consumer Protection in the 21st Century. The agency’s opening hearing took place at Georgetown Law in September 2018. Georgetown faculty and fellows have testified at a number of the hearings, including the April 9-10 hearing on consumer privacy, in connection with which today’s comments were filed.

You can read the clinic’s full comment here.

May 31, 2019 - Institute Announces New Project on Algorithmic Fairness for People with Disabilities

By Alexandra Reeve Givens, Executive Director

I’m proud to announce the launch of a new multi-year project at Georgetown’s Institute for Tech Law & Policy on algorithmic fairness and the rights of people with disabilities.

Supported by the Ford Foundation, the project is designed to analyze the impact of algorithmic decision-making on people with disabilities—in employment, benefits determinations, and other settings where AI-driven decision-making touches people’s lives. The project will assess specific areas of risk, analyze gaps in existing legal and policy protections, and forge cross-disciplinary collaborations to center the perspectives of people with disabilities in efforts to develop algorithmic fairness and accountability. You can read more about our approach below.

Our Goal

This project seeks to make an important contribution to the growing conversation around fairness, accountability, and transparency in machine learning. Despite increasing focus on ethics in AI, few AI scholars or policy experts are considering the unique risks and impacts of algorithmic decision-making for the millions of individuals affected by disability. In turn, disability rights advocates and regulators are just starting to consider how machine learning may impact this community.

This troubling gap exists even though people with disabilities are, in many ways, disproportionately vulnerable to the threats of algorithmic bias. Accommodation requirements, gaps in an individual’s employment history, or the need for flexible work shifts may all cause automated systems to penalize individuals. Programs that evaluate employees based on sentiment analysis may down-rate those whose expressions vary from an algorithm’s perceived “norm”. Job screening programs that rely on timed answers may penalize candidates who rely on assistive technologies. These are but a handful of examples.

While the emerging literature around AI bias gives some useful context for disability rights, the legal and policy framework for people with disabilities requires specific study. For example, the Americans with Disabilities Act (ADA) requires employers to adopt “reasonable accommodations” for qualified individuals with a disability. But what is a “reasonable accommodation” in the context of machine learning and AI? How will the ADA’s unique standard interact with case law and scholarship about AI bias against other protected groups? When the ADA governs what questions employers can ask about a candidate’s disability, how should we think about inferences from data the employer otherwise collects?

The Institute’s program on algorithmic fairness for people with disabilities seeks to address these questions. In partnership with other civil society organizations and our Project Advisory Committee, it will foster collaborative engagement, conduct legal and policy analysis, and produce materials to shape employer practices, inform potential enforcement actions, and empower individuals to know their rights.

Our Approach

Our initial step is to hire a Program Director who brings deep experience in disability rights. We are eager to find a talented leader who can direct the project’s research agenda, lead stakeholder engagement, and oversee and execute the project work in collaboration with me and the Institute’s other team members. Individuals who are personally affected by disability are particularly encouraged to apply. The hiring notice is available here. Please share it widely!

Our second step is to continue developing our Project Advisory Committee, which is currently in formation. The Project Advisory Committee consists of disability rights experts, including those living with disability themselves, AI experts and other individuals who will help inform the project work. The Project Advisory Committee will be finalized later this year, after the Program Director has been hired and has had an opportunity to weigh in. If you have recommendations for the Project Advisory Committee or would like to be considered, please contact us using this form.

Finally, this project will operate through extensive collaboration with other stakeholders, including individuals and organizations focused on disability rights, those at the intersection of disability rights and the rights of other marginalized communities, and those working on algorithmic fairness and accountability. The project’s key goal is to foster information exchange and knowledge sharing between these communities, with an eye to prioritizing issues, inspiring collaborations, and developing actionable work. We are committed to a consciously intersectional approach, working to center the experience of multiply-marginalized communities and advance equity for the most marginalized individuals. 

Join Us!

Are you an experienced disability rights lawyer who cares about these issues, or do you know someone who is? Please review the Job Posting for our Program Director and apply! Applications will be considered on a rolling basis starting June 3, 2019.

Do you have thoughts about this project, or would you like to keep up to date on the latest developments? Please join our mailing list or contact us using this form.

We look forward to engaging with you in this work.

May 21, 2019 - Georgetown Hosts Biennial Conference of the Partnership for Progress on the Digital Divide

This week, the Institute is hosting the biennial conference for the Partnership for Progress on the Digital Divide (PPDD).

The international conference will feature keynote speakers Vint Cerf, widely known as one of the “Fathers of the Internet”, and Larry Irving, former head of the National Telecommunications and Information Administration (NTIA), as well as policymakers from around the globe.

On Wednesday at 9:30 a.m., FCC Commissioner Geoffrey Starks will give his first speech since being confirmed as a Commissioner, followed by a fireside chat with Gigi Sohn, now a Distinguished Fellow at the Institute.

Later in the week, the Institute’s Executive Director, Alexandra Givens, will host a panel on Algorithmic Bias and the Digital Divide. Other panels will cover the recently-introduced Digital Equity Act, the 2020 Census, the use of Open Source Software to Increase Digital Engagement, and a wide range of other subjects.

A write-up of the conference is available here. The full list of speakers is available at www.ppdd.org/conferences/ppdd2019.

Founded in 2002, PPDD addresses the digital divide: the persistent gap between populations who have access to information and communications technology and those who lack it. Currently the only academic professional organization in the world focused on these issues, PPDD brings together policymakers, academics, and practitioners who work on closing the digital divide and addressing the many other challenges and opportunities presented by the digital age.


SHOWCASING UNIVERSITY LEADERSHIP IN TECH & SOCIETY

The conference will take place from May 22-24, 2019 on Georgetown’s Main Campus, and will feature a number of Georgetown experts in key speaking roles, as well as practitioners and policymakers representing academia, government, industry, and the nonprofit sector.

Georgetown’s Provost, Robert Groves, will open the conference on Wednesday, May 22, which will also feature panels by faculty members and practitioners from the Tech Institute, the Beeck Center for Social Impact & Innovation, the McDonough School of Business’s Center for Business & Public Policy, and other research hubs.

Georgetown’s representatives at PPDD are key leaders in the university’s new Initiative on Tech & Society, which focuses on developing innovative solutions at the intersection of ethics, policy and governance.

“As technology transforms how we apply for jobs, access opportunities, and engage with the world around us, questions of access and affordability are paramount,” said Alexandra Givens, Executive Director of the Tech Institute and a co-host of the conference. “We all must engage on the question of how technology can be deployed to create new opportunities, not deepen existing inequality. We are proud to host this important convening at Georgetown, where many of our faculty and policy centers are deeply engaged on this key issue.”

LEADING EXPERTS CONVENE AT GEORGETOWN

This year’s conference at Georgetown marks the 25th anniversary of the recognition of the digital divide through social scientific research, and is set to be the largest ever worldwide gathering of experts on technology, disability, and the digital divide.

The speaking sessions of the conference will be available to the public via livestream at www.ppdd.org.

Individuals interested in attending the conference may register at www.ppdd.org.

Members of the press are requested to contact Susan Kretchmer at Susan.Kretchmer@ppdd.org.

May 16, 2019 - Privacy Center Releases Two Major Reports on the Use of Facial Recognition Technology

Today, the Georgetown Center on Privacy & Technology released two reports detailing the widespread use and misuse of face surveillance by police departments nationwide. The reports build on the Center’s groundbreaking work on face recognition, The Perpetual Line-Up, which drew national attention for its finding that half of all adult Americans have their identities used in police face recognition databases.

In the first new report, Garbage In, Garbage Out, the Center reveals that police departments regularly use altered images to generate face recognition matches when photos of suspects fail to return results. To generate leads, police departments submit blurred photos, photos with parts of faces swapped in from other images, and, in some cases, photos of celebrities that investigators believe look similar to suspects. The report explains that most police departments have no prohibitions on using face surveillance systems in this way:

“There are no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads... The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs… Unfortunately, police departments' reliance on questionable probe photos appears all too common.”

The second report, America Under Watch, documents the widespread deployment of face surveillance systems in major U.S. cities such as Chicago, New York, and Washington. The report demonstrates that major police departments have the ability to “scan live video from cameras located at businesses, health clinics, schools, and apartment buildings.” Even departments that claim not to use face surveillance have “paid to acquire and maintain the technology for years.”

The reports come as policymakers and industry are grappling with the reality of face surveillance. Earlier this week, the City of San Francisco voted to ban the use of facial recognition technology by police and city agencies. Microsoft, itself a developer of facial recognition systems, has called for regulations of the technology.

The reports are the work of the Privacy Center’s Clare Garvie (L’16) and Executive Director Laura Moy.

Garvie will testify about the report at a hearing on Wednesday, May 22, before the U.S. House of Representatives’ Committee on Oversight and Reform.

Read the reports at FlawedFaceData.com and AmericaUnderWatch.com. And for a robust conversation on Twitter, check out @ClareAngelyn, @LauraMoy, and @GeorgetownCPT.

April 24, 2019 - Georgetown Students Showcase Legal Apps Created to Improve Access to Justice

Today, our Tenth Iron Tech Lawyer Competition showcased apps created by Georgetown Law students to improve access to justice.

Created in our Georgetown Law class on Technology, Innovation and Law, the apps addressed a range of social and legal challenges with innovative new approaches. Students presented their work in a final pitch competition attended by experts from the access to justice community, with over 200 viewers casting votes online.

The winning app was the Whitman-Walker Health D.C. Name Change App, which helps D.C. adults who identify as transgender or gender expansive to change their legal names. The app transforms a lengthy, confusing process into a convenient, time-saving experience by populating three forms: a Name Change Petition, a Fee Waiver Request, and a Motion to Seal their Petition.

The award for excellence in design went to The Legal Check Up app, designed for Georgetown’s Health Justice Alliance to identify potential legal concerns for cancer patients in an attempt to provide a more comprehensive approach to healthcare.

The social media prize (for greatest popular vote) went to Navi-Gator, which helps people required to register after conviction determine how to comply with Illinois law. The app features a GPS-driven map that determines where such a person can live, work, and travel, and connects them to resources and a user community.

The class’s other exceptional projects were:

You can read more about the apps and watch the student presentations at www.irontechlawyer.com, and on Twitter following @GtwnIronTech and #IronTechLawyer.

Students, faculty and judges at the 2019 Iron Tech Lawyer competition


April 24, 2019 - Institute Fellow Gigi Sohn Testifies Before Maine Legislature on Broadband Privacy

Our Institute Fellow Gigi Sohn testified today before the Maine Legislature’s Joint Committee on Energy, Utilities and Technology in support of legislation that would require broadband internet access providers to protect their customers’ privacy.

Sohn, who has been a public interest advocate in technology policy for over 30 years, was a Counselor to former FCC Chairman Tom Wheeler when the FCC adopted the 2016 Broadband Privacy Rules, on which Maine’s proposed legislation is based. The federal broadband privacy rules were reversed by a controversial Congressional vote under the Congressional Review Act in April 2017.

Speaking in support of the Maine bill, L.D. 946, Sohn noted the unparalleled access broadband providers have to their customers’ private information:

“Broadband providers receive, store and use a vast amount of consumer information, including sensitive information. As the FCC found in 2016, a broadband provider ‘sits at a privileged place in the network, the bottleneck between the customer and the rest of the network….’ This gatekeeper position allows them to see every packet that a consumer sends and receives over the Internet while on the network, including its contents.

“The FCC’s record showed that only three companies have third-party tracking capabilities across more than 10 percent of the top one million websites, and none of those has access to more than approximately 25 percent of web pages. In contrast, a broadband provider sees 100 percent of a customer’s unencrypted Internet traffic.

“Broadband providers also see all the encrypted traffic over their networks. Though they do not see the contents of these packets, they see when and how long a person is watching TV, visiting a website, turning on the lights, or using other devices. In addition, because broadband Internet access services are paid services, the broadband provider has the subscriber’s name, address, phone number, and billing history. This gives them a uniquely detailed and comprehensive view of their customers.”

L.D. 946 would prohibit a provider of broadband Internet access service from using, selling, or sharing customers’ personal information without express consent, and would require providers to take reasonable measures to protect customers’ personal information.

Service providers’ practices have received increased attention since a January 2019 investigation by Vice Motherboard reported that AT&T, T-Mobile and Sprint sold customers’ geolocation data to data brokers, who then marketed it to bail bond companies and other third parties without customers’ knowledge or consent.

Referencing Congress’s repeal of the 2016 broadband privacy rules and recent actions by the FCC that have reduced the FCC’s authority over broadband internet access services, Sohn noted:

“When the federal government abdicates its responsibility to protect consumers, the states must step in. 

“Broadband providers complain that if every state were to pass a similar law, they will be forced to comply with a ‘patchwork’ of different consumer privacy protections, and that a federal framework would be preferable. I have little sympathy for an industry that was the driving force in convincing Congress to repeal the existing federal broadband privacy framework (the FCC’s 2016 rules) and then performed an encore by pushing the FCC to abdicate its oversight over broadband…

“The solution to the alleged ‘patchwork’ problem is for the companies to comply with the highest level of privacy protection a state requires.”

You can read Gigi’s full testimony here.

April 15, 2019 - Professor Neel Sukhatme Named Thomas Edison Visiting Scholar at U.S. Patent & Trademark Office

Georgetown Law professor and Tech Institute Faculty Advisor Neel Sukhatme was named the Thomas Alva Edison Visiting Scholar at the U.S. Patent and Trademark Office today.

The Edison Visiting Scholars are senior scholars and experts in law, economics, and related fields. They are invited to the Patent & Trademark Office to pursue their research on a wide range of topics related to IP and IP policy, and they advise policymakers on matters close to their areas of expertise. Georgetown Law professor Jay Thomas served as the inaugural Edison Scholar in 2012.

Professor Sukhatme is an Associate Professor at the Law School, where he teaches classes on patent law and empirical methods. He received his Ph.D. in Economics from Princeton University, where he was awarded the 2014 Towbes Prize for Outstanding Teaching, and his J.D. from Harvard Law School. Professor Sukhatme received his Bachelor’s Degree in Computer Engineering with a minor in Mathematics from the University of Illinois.

Professor Sukhatme’s research focuses on empirical patent law and law and economics. He teaches Property, Patent Law, Corporate Finance, and Empirical Analysis for Lawyers and Policymakers, and he co-directs the Georgetown Law and Economics Workshop series.

Some of his forthcoming and recent work includes:

Congratulations Neel!

April 9-10, 2019 - Georgetown Law Professors Testify at FTC Hearing on Consumer Privacy

Four Georgetown Law faculty will testify this week at the FTC’s hearings on approaches to consumer privacy. Associate Dean Paul Ohm, Professor David Vladeck, the Privacy Center’s Executive Director Laura Moy, and Adjunct Professor Marc Groman join a mix of consumer advocates, academics, industry voices, and other experts in a two-day hearing, part of the FTC’s series of hearings on Competition and Consumer Protection in the 21st Century.

In the day’s opening panel, Associate Dean Paul Ohm joined Neil Chilson, former FTC Chief Technology Officer, and Alastair Mactaggart, chairman of Californians for Consumer Privacy, to discuss the goals of privacy protection. In his remarks, Ohm challenged the FTC to think about the harms it is positioned to address, not just those it has historically addressed. He predicted an increasing need for the agency and other enforcers to focus on “dark patterns”, tricks used to manipulate users into clicking buttons or selecting options they wouldn’t otherwise choose. Bipartisan legislation on dark patterns was introduced by Senators Mark Warner and Deb Fischer today.

Ohm also referenced his prior calls for regulation that adjusts based on the scale of various companies, noting that companies with millions of customers should be held to higher standards than those with a small user base. Ohm wrote about that theory in the Georgetown Law Technology Review 2018 Symposium issue, available here.

Later in the day, the Privacy Center’s Laura Moy joined a panel focused on current approaches to privacy protection. The panel noted the benefits and drawbacks of various privacy frameworks, with Moy noting that a traditional focus on individual harms fails to account for societal harms, including discriminatory advertising, amplification of hate speech, and misinformation and disinformation. Referencing current discussions about federal privacy legislation, she emphasized that a strong patchwork of state laws will better protect consumers than a weak federal standard.

The hearings continue tomorrow, with Georgetown professor David Vladeck, former head of the FTC’s Bureau of Consumer Protection, and adjunct professor Marc Groman, former head of privacy in the Office of Management and Budget, joining a full line-up. The livestream is available at https://competition-consumer-protection-hearings.videoshowcase.net/.

Good coverage of the hearings is available on Twitter at #ftchearings.

The FTC’s James Cooper, former FTC Chief Technology Officer Neil Chilson, Alastair Mactaggart and Georgetown’s Paul Ohm speak at the FTC’s hearing on consumer privacy


April 4, 2019 - Senator Ed Markey, FTC and FCC Commissioners Headline Event with Common Sense Media on Children in the Digital Age

This week, the Institute is partnering with Common Sense Media and Georgetown’s Children’s Digital Media Center on a conference about the present and future state of children’s digital well-being.

Among other speakers, the conference features Senator Ed Markey, FTC Commissioner Rohit Chopra, FCC Commissioner Jessica Rosenworcel, former Surgeon General Dr. Vivek Murthy, and Cameron Kasky, a survivor of the mass shooting at Marjory Stoneman Douglas High School and co-founder of March for Our Lives.

Georgetown Law’s David Vladeck and Angela Campbell will speak on panels about children’s privacy and competition policy, respectively. Mr. Kasky will discuss the power of technology to mobilize a movement, drawing on his experience organizing the 2018 March for Our Lives, a student-led demonstration in support of stronger gun violence prevention measures, with almost 900 partner demonstrations across the United States and around the world.

If you missed the event in person, a recording of the event will soon be available here.

Truth About Tech: Solutions for Digital Well-Being

April 4, 2019

9am - 5pm

Georgetown University School of Continuing Studies, 640 Mass Ave NW

  • 9:15 a.m.: Welcome 

  • 9:30–10 a.m.: Opening Remarks

    • Hon. Ed Markey, U.S. senator from Massachusetts

  • 10–10:45 a.m.: Staying Connected: Tech and Social Relationships

    • Dr. Vivek Murthy, 19th surgeon general of the United States

    • in conversation with James P. Steyer, CEO and founder, Common Sense

  • 10:45–11 a.m.: Break

  • 11–11:15 a.m.: Enforcing COPPA: Are We Protecting Kids' Privacy?

    • Rohit Chopra, FTC commissioner

  • 11:15–12 p.m.: You Are the Product: The High Cost of a Free Internet

    • Franklin Foer, The Atlantic

    • Nicol Turner-Lee, Brookings Institution

    • David Vladeck, Georgetown University Law Center

    • Cecilia Kang, New York Times (moderator)

  • 12–12:45 p.m.: Lunch

  • 12:45–1:10 p.m.: Holding Tech Accountable

    • Hon. Karl Racine, Attorney General of Washington, D.C.

  • 1:10–1:45 p.m.: Building Movements: Mobilizing the Power of Tech

    • Cameron Kasky, co-founder, March for Our Lives

    • in conversation with Elizabeth Galicia, Common Sense

  • 1:45–2:30 p.m.: Future Tech: Raising Kids in the AI Age

    • Jakki Bailey, University of Texas

    • Sandra Calvert, Georgetown University

    • Justine Cassell, Carnegie Mellon University

    • Michael Robb, Common Sense (moderator)

  • 2:30–2:45 p.m.: Break

  • 2:45–3:30 p.m.: Trust & Tech: Disrupting Monopolies

    • Roger McNamee, co-founder of Elevation Partners

    • Barry Lynn, Open Markets Institute 

    • Angela Campbell, Georgetown University Law Center (moderator)

  • 3:30–3:45 p.m.: Closing the Homework Gap

    • Jessica Rosenworcel, FCC commissioner

  • 3:45–4:30 p.m.: Digital Equity: Ensuring Access and the Making of Digital Citizens

    • Rachel Barr, Georgetown University

    • Lisa Guernsey, New America

    • Tina Plaza-Whoriskey, Child Trends

    • Amina Fazlullah, Common Sense (moderator)

  • 5:00 p.m.: Closing


March 25, 2019 - Institute Hosts Event on Algorithmic Bias & the Digital Divide

On Monday, the Tech Institute hosted a panel event on Algorithmic Exclusion: How Data Deserts in the U.S. Perpetuate Inequity. The event drew an important connection between the lack of connectivity for marginalized communities in rural and low-income areas and increasing concerns about bias in the algorithms that impact so many aspects of our lives.

Lack of connectivity hurts students trying to do their homework, jobseekers looking for work, and communities engaging in online discourse. But what about its impact on the fairness & equity of AI?

The digital divide dramatically exacerbates inequity in our society: nearly half of all people in the U.S. without home internet access are people of color. Six in 10 rural residents say high-speed internet access is a problem in their area.

At a time when algorithms shape every facet of our lives—from how government resources are allocated, to the products and information you see online—disparities in online access risk perpetuating exclusion for communities of color, low-income communities and rural America.

Representative Brenda Lawrence (D-MI) gave remarks at the event, emphasizing that bridging the digital divide should be an issue of national priority. The panelists included a mix of consumer advocates, industry representatives, and data scientists.

Thanks to our partners Public Knowledge and The Goodfriend Group for collaborating with us on this event. You can view the video and full details at www.georgetowntech.org/datadeserts.