June 12, 2019 - Institute Fellow Deloris Wilson Selected as 2019 Atlantic Fellow for Racial Equity

Congratulations to our fellow Deloris Wilson, who was just selected as a 2019 Atlantic Fellow for Racial Equity!

The Atlantic Fellows program brings together leaders on the frontlines of the struggle for racial equity in South Africa and the U.S. to explore, imagine, experiment and build long-term solutions for impactful change. The inaugural cohort of 19 fellows includes authors, community organizers, documentary filmmakers, faith leaders, scientists and other leaders who work to challenge anti-Black racism and build the policies, institutions and narratives needed for a more equitable future.

For two years, Deloris has led the Tech Institute’s work building BEACON: The D.C. Women Founders’ Initiative, a community-led initiative to make D.C. a leading hub for diverse women entrepreneurs. In partnership with the D.C. Mayor’s Office, Google, and a broad coalition of community leaders, BEACON studies gaps in the ecosystem supporting women entrepreneurs and works to fill them by mobilizing businesses, government and the community.

Among other projects, BEACON runs an annual grant program that funds innovative projects to support D.C.’s women entrepreneurs. It operates D.C.’s largest directory of women-owned businesses and resources for women entrepreneurs. BEACON hosts or co-hosts regular programming, leads campaigns to address issues affecting women entrepreneurs, identifies speaking and vending opportunities for women business owners, and produces a biweekly newsletter showcasing resources and opportunities. BEACON is particularly focused on elevating the voices of women entrepreneurs of color, who are historically underfunded and under-supported despite being the largest group of women business owners in the District.

From 2016 to 2019, BEACON was incubated at the Tech Institute, where we conducted strategic and operational work to build the network and led academic research on ways to improve D.C.’s ecosystem for diverse women founders. In that time, BEACON grew to serve over 3,000 entrepreneurs, funded 30 community organizations, and shared its findings locally and nationally. Our academic work culminated in Deloris’s 2018 report, “Building Inclusive Ecosystems with Intentionality: A Strategy to Enhance Support for D.C.’s Women Founders.”

Despite BEACON’s strong roots in D.C., our research into building effective support systems for diverse entrepreneurship has wide application. We’ve presented our work before the National Women’s Business Council, the Senate Committee on Small Business and Entrepreneurship, the United Nations Foundation, the U.S. Chamber of Commerce Foundation, and other national organizations. The Tech Institute’s work on BEACON was generously supported by a philanthropic gift from Google.

After two and a half years of incubation, we’re proud that BEACON has now grown into an independent 501(c)(3) organization, led by D.C.-based entrepreneurs and community leaders and supported by an Advisory Board of leading entrepreneurs, VCs, service providers, policy experts and more. Deloris has become BEACON’s Chief Strategy Officer. Our Executive Director, Alexandra Givens, remains deeply involved on BEACON’s board. You can read more about BEACON’s ongoing work here.

We are deeply proud of our work building BEACON, and doubly so that Deloris will continue her passionate leadership on racial justice issues both with BEACON and through her Atlantic Fellowship. Congratulations Deloris!

June 6, 2019 - Announcing the Iron Tech Lawyer Invitational!

Are you a student or professor interested in creating legal tech or data science solutions to help the public interest? Read about the new Iron Tech Lawyer Invitational, which we’re proudly announcing today!


Georgetown Law’s Institute for Technology Law & Policy and the Justice Lab at Georgetown Law are pleased to announce the inaugural Iron Tech Lawyer Invitational, a national competition for student-created legal tech solutions that help bridge the justice gap.

Student teams from qualifying universities are invited to a one-day pitch competition in Washington, D.C. to showcase a legal tech or data analysis tool they have developed for a pro bono organizational client.

The students must complete the work in an academic course, clinic, or supervised independent study during the 2019-2020 academic year. Client organizations can include legal services organizations or other non-profits focused on assisting people with civil legal problems.

Qualifying universities are invited to submit one student project for the Invitational. Student teams will travel to Washington, D.C. during the week of April 12, 2020 to present their projects at the Iron Tech Lawyer Invitational, hosted at Georgetown Law.


Projects will be evaluated by a panel of experts in access to civil justice, legal design and technology. The winning team will be awarded $5,000 in funding support to advance or complete their technology or data science solution.


The Iron Tech Lawyer Invitational is designed to encourage the creation of academic courses focused on the thoughtful development of technology and data-driven solutions to help improve the civil justice system. Student teams must be supported by a professor, and complete their project in an academic course, clinic or independent study.

Professors who are interested in sending a student team to the Invitational must meet the following criteria:

  • The students must complete a student project in an academic program, i.e. in an academic course, clinic, or supervised independent study.

  • The student project must involve the creation of a technology tool or data project that strengthens legal service delivery or otherwise improves access to the civil legal system.

  • The student project must be developed for a “client” that is a non-profit legal services provider or other non-profit that assists people with their civil legal problems.

  • The student project must be supported by a faculty sponsor, such as the teacher of the course or supervisor of an independent study.

  • The student project must be completed during the 2019-2020 academic year.

  • Only one student project may be submitted per university.


(1) Faculty Interest Form. Professors who are interested in sending a student team to the Iron Tech Invitational must complete a School Interest Form by July 15, 2019. (This form may not be submitted by students). The Interest Form is intended as a high-level expression of interest; you do not yet need to select which students you will send, and client organizations and specific projects need not be identified.

(2) Follow-Up. Professors who have submitted a School Interest Form will be contacted by the organizers to discuss the competition, shared pedagogical goals, and eligibility.

(3) Faculty Application. Professors must submit a School Application to secure a slot for one student project from their university. At this stage, client organizations and specific student projects must be identified. If a professor is supervising multiple student projects, however, they need not yet have selected which project they will send to the Invitational. We expect that many professors will run their own mini-Iron Tech Competition within their class or program, or between courses offered within their university, to select which student team proceeds to the Invitational.

(4) Selection of Qualifying Schools. The Invitational’s organizers will select 5-8 qualifying universities, each of which will send a student project of its choosing to the Invitational.

(5) Submission of Final Student Projects. Qualifying universities will notify the Invitational’s organizers of the student project they have chosen to represent them, and submit a link to the final project and supporting documentation.


Projects will be judged by a panel of experts in access to justice, legal design and technology. Award criteria will include:

  • Usefulness

  • Completeness

  • Ambition & Creativity

  • Design

  • Student/Team Presentation


Professors are responsible for identifying and securing client organization(s) for the student projects, and are solely responsible for the relationship with any client organization. Schools must provide the necessary software to develop the application.

The organizers make no representation as to the accuracy, or suitability for use, of student projects submitted to the Iron Tech Invitational. Projects are not the work product of Georgetown Law or the organizers. The organizers reserve the right to amend the application or competition rules, and will provide notice to applicants of any changes.



See our FAQs, or email TechInstitute@law.georgetown.edu.


The Iron Tech Invitational is made possible by generous support from the Bigglesworth Family Foundation.


June 3, 2019 - Georgetown's Tech & Communications Clinic Files FTC Comment on Protecting Children's Privacy

Students in Georgetown’s IPR Tech & Communications Clinic today filed comments with the FTC as part of the agency’s hearing on Competition and Consumer Protection in the 21st Century: The FTC’s Approach to Consumer Privacy.

The comments, filed on behalf of clinic clients the Center for Digital Democracy and the Campaign for a Commercial-Free Childhood, focused on protecting the privacy of children and teens:

“While the Children’s Online Privacy Protection Act (COPPA) is intended to protect the privacy of children under age 13, it is no longer up to the task.

“COPPA’s underlying assumption is that parents will be able to protect their children’s privacy if companies give notice of their privacy practices and do not collect personal information unless the parent gives consent. But this no longer works. Most parents do not read privacy policies, and even if they do, many do not provide the information needed for informed consent.

“Given the unprecedented amount of data being collected, the sophistication of data mining techniques, and the lack of transparency, most people lack a sufficient understanding of the scope of the data collected and how it could be used. Moreover, because the FTC has not effectively enforced COPPA, many companies feel free to ignore COPPA’s requirements.”

The clinic’s comment elaborates on the many ways in which new technologies have made it harder to protect children’s privacy in the 20 years since the Children’s Online Privacy Protection Act (COPPA) was enacted. Digital technologies have become a pervasive presence in the lives of children and teens, and massive-scale data mining and other techniques have made “informed consent” an unworkable regime for protecting users’ privacy.

The comment notes shortcomings in the FTC’s current enforcement of COPPA, arguing that the FTC rarely brings enforcement actions under COPPA even though many companies fail to comply with COPPA requirements. It urges the FTC to make public the information provided by “safe harbor organizations”, which exist to certify compliance with COPPA but often do not rigorously enforce their guidelines.

The comment argued that new legislation is also needed:

“New legislation, such as the bipartisan Markey-Hawley bill, is needed to address COPPA’s shortcomings.

“Any legislation must include developmentally appropriate protections for teens, because COPPA only covers children under age 13. The legislation should also prohibit practices that may be harmful to children, rather than requiring parents to read and try to understand the impact of multiple privacy policies.

“Until such legislation is passed, however, the FTC can and should do more to better protect children’s privacy. Specifically, we urge the FTC to undertake more enforcement actions, to enforce COPPA’s notice requirements, and to fix problems with the COPPA safe harbor program.”

The IPR Tech & Communications Clinic has for decades been one of the most active watchdogs tracking enforcement of children’s privacy laws.

In December 2018, the clinic filed a complaint with the FTC requesting an investigation into whether Google's marketing of apps directed to children in the Google Play Store violates the FTC Act’s prohibition on deceptive and unfair practices.

In October 2018, the clinic partnered with the Institute in hosting the conference “COPPA at 20: Protecting Children’s Privacy in the Digital Age.”

The FTC’s hearing is part of an extensive series of hearings the FTC is convening on Competition and Consumer Protection in the 21st Century. The agency’s opening hearing took place at Georgetown Law in September 2018. Georgetown faculty and fellows have testified at a number of the hearings, including the April 9-10 hearing on consumer privacy, in connection with which today’s comments were filed.

You can read the clinic’s full comment here.

May 31, 2019 - Institute Announces New Project on Algorithmic Fairness for People with Disabilities

By Alexandra Reeve Givens, Executive Director

I’m proud to announce the launch of a new multi-year project at Georgetown’s Institute for Tech Law & Policy on algorithmic fairness and the rights of people with disabilities.

Supported by the Ford Foundation, the project is designed to analyze the impact of algorithmic decision-making on people with disabilities—in employment, benefits determinations, and other settings where AI-driven decision-making touches people’s lives. The project will assess specific areas of risk, analyze gaps in existing legal and policy protections, and forge cross-disciplinary collaborations to center the perspectives of people with disabilities in efforts to develop algorithmic fairness and accountability. You can read more about our approach below.

Our Goal

This project seeks to make an important contribution to the growing conversation around fairness, accountability and transparency in machine learning. Despite increasing focus on ethics in AI, few AI scholars or policy experts are considering the unique risks and impacts of algorithmic decision-making for the millions of individuals affected by disability. In turn, disability rights advocates and regulators are just starting to consider how machine learning may impact this community.

This troubling gap exists even though people with disabilities are, in many ways, disproportionately vulnerable to the threats of algorithmic bias. Accommodation requirements, gaps in an individual’s employment history, or the need for flexible work shifts may all cause automated systems to penalize individuals. Programs that evaluate employees based on sentiment analysis may down-rate those whose expressions vary from an algorithm’s perceived “norm”. Job screening programs that rely on timed answers may penalize candidates who rely on assistive technologies. These are but a handful of examples.

While the emerging literature around AI bias gives some useful context for disability rights, the legal and policy framework for people with disabilities requires specific study. For example, the Americans with Disabilities Act (ADA) requires employers to adopt “reasonable accommodations” for qualified individuals with a disability. But what is a “reasonable accommodation” in the context of machine learning and AI? How will the ADA’s unique standard interact with case law and scholarship about AI bias against other protected groups? When the ADA governs what questions employers can ask about a candidate’s disability, how should we think about inferences from data the employer otherwise collects?

The Institute’s program on algorithmic fairness for people with disabilities seeks to address these questions. In partnership with other civil society organizations and our Project Advisory Committee, it will foster collaborative engagement, conduct legal and policy analysis, and produce materials to shape employer practices, inform potential enforcement actions, and empower individuals to know their rights.

Our Approach

Our initial step is to hire a Program Director who brings deep experience in disability rights. We are eager to find a talented leader who can direct the project’s research agenda, lead stakeholder engagement, and oversee and execute the project work in collaboration with me and the Institute’s other team members. Individuals who are personally affected by disability are particularly encouraged to apply. The hiring notice is available here. Please share it widely!

Our second step is to continue developing our Project Advisory Committee, which is currently in formation. The Committee consists of disability rights experts (including people living with disability themselves), AI experts, and other individuals who will help inform the project work. It will be finalized later this year, after the Program Director has been hired and has had an opportunity to weigh in. If you have recommendations for the Project Advisory Committee or would like to be considered, please contact us using this form.

Finally, this project will operate through extensive collaboration with other stakeholders, including individuals and organizations focused on disability rights, those at the intersection of disability rights and the rights of other marginalized communities, and those working on algorithmic fairness and accountability. The project’s key goal is to foster information exchange and knowledge sharing between these communities, with an eye to prioritizing issues, inspiring collaborations, and developing actionable work. We are committed to a consciously intersectional approach, working to center the experience of multiply-marginalized communities and advance equity for the most marginalized individuals. 

Join Us!

Are you an experienced disability rights lawyer who cares about these issues, or do you know someone who is? Please review the Job Posting for our Program Director and apply! Applications will be considered on a rolling basis starting June 3, 2019.

Do you have thoughts about this project, or would you like to keep up to date on the latest developments? Please join our mailing list or contact us using this form.

We look forward to engaging with you in this work.

May 21, 2019 - Georgetown Hosts Biennial Conference of the Partnership for Progress on the Digital Divide

This week, the Institute is hosting the biennial conference for the Partnership for Progress on the Digital Divide (PPDD).

The international conference will feature keynote speakers Vint Cerf, widely known as one of the “Fathers of the Internet”, and Larry Irving, former head of the National Telecommunications and Information Administration (NTIA), as well as policymakers from around the globe.

On Wednesday at 9:30 a.m., FCC Commissioner Geoffrey Starks will give his first speech since being confirmed as a Commissioner, followed by a fireside chat with Gigi Sohn, now a Distinguished Fellow at the Institute.

Later in the week, the Institute’s Executive Director, Alexandra Givens, will host a panel on Algorithmic Bias and the Digital Divide. Other panels will cover the recently-introduced Digital Equity Act, the 2020 Census, the use of Open Source Software to Increase Digital Engagement, and a wide range of other subjects.

A write-up of the conference is available here. The full list of speakers is available at www.ppdd.org/conferences/ppdd2019.

Founded in 2002, PPDD addresses the digital divide: the persistent gap between populations who have access to information and communications technology and those who lack it. Currently the only academic professional organization in the world focused on these issues, PPDD brings together policymakers, academics and practitioners who work on closing the digital divide and addressing the many other challenges and opportunities presented by the digital age.


The conference will take place from May 22-24, 2019 on Georgetown’s Main Campus, and will feature a number of Georgetown experts in key speaking roles, as well as practitioners and policymakers representing academia, government, industry, and the nonprofit sector.

Georgetown’s Provost, Robert Groves, will open the conference on Wednesday, May 22. Featured panels will include faculty members and practitioners from the Tech Institute, the Beeck Center for Social Impact & Innovation, the McDonough School of Business’s Center for Business & Public Policy, and other research hubs.

Georgetown’s representatives at PPDD are key leaders in the university’s new Initiative on Tech & Society, which focuses on developing innovative solutions at the intersection of ethics, policy and governance.

“As technology transforms how we apply for jobs, access opportunities, and engage with the world around us, questions of access and affordability are paramount,” said Alexandra Givens, Executive Director of the Tech Institute and a co-host of the conference. “We all must engage on the question of how technology can be deployed to create new opportunities, not deepen existing inequality. We are proud to host this important convening at Georgetown, where many of our faculty and policy centers are deeply engaged on this key issue.”


This year’s conference at Georgetown marks the 25th anniversary of the recognition of the digital divide through social scientific research, and is set to be the largest ever worldwide gathering of experts on technology, disability, and the digital divide.

The speaking sessions of the conference will be available to the public via livestream at www.ppdd.org.

Individuals interested in attending the conference may register at www.ppdd.org.

Members of the press are requested to contact Susan Kretchmer at Susan.Kretchmer@ppdd.org.

May 16, 2019 - Privacy Center Releases Two Major Reports on the Use of Facial Recognition Technology

Today, the Georgetown Center on Privacy & Technology released two reports detailing the widespread use and misuse of face surveillance by police departments nationwide. The reports build on the Center’s groundbreaking work on face recognition, The Perpetual Line-Up, which drew national attention for its finding that half of all adult Americans have their identities used in police face recognition databases.

In the first new report, Garbage In, Garbage Out, the Center reveals that police departments regularly use altered images to generate face recognition matches when photos of suspects fail to return results. To generate leads, police departments submit blurred photos, faces with features swapped in from other photos, and, in some cases, photos of celebrities that investigators believe look similar to suspects. The report explains that most police departments have no prohibitions on using face surveillance systems in this way:

“There are no rules when it comes to what images police can submit to face recognition algorithms to generate investigative leads... The stakes are too high in criminal investigations to rely on unreliable—or wrong—inputs… Unfortunately, police departments' reliance on questionable probe photos appears all too common.”

The second report, America Under Watch, documents the widespread deployment of face surveillance systems in major U.S. cities, such as Chicago, New York, and Washington. The report demonstrates that major police departments have the ability to “scan live video from cameras located at businesses, health clinics, schools, and apartment buildings.” Even departments that claim to not use face surveillance have “paid to acquire and maintain the technology for years.”

The reports come as policymakers and industry are grappling with the reality of face surveillance. Earlier this week, the City of San Francisco voted to ban the use of facial recognition technology by police and city agencies. Microsoft, itself a developer of facial recognition systems, has called for regulations of the technology.

The reports are the work of the Privacy Center’s Clare Garvie (L’16) and Executive Director Laura Moy.

Garvie will testify about the report at a hearing on Wednesday May 22nd before the U.S. House of Representatives’ Committee on Oversight and Reform.

Read the reports at FlawedFaceData.com and AmericaUnderWatch.com. And for a robust conversation on Twitter, check out @ClareAngelyn, @LauraMoy, and @GeorgetownCPT.

April 24, 2019 - Georgetown Students Showcase Legal Apps Created to Improve Access to Justice

Today, our Tenth Iron Tech Lawyer Competition showcased apps created by Georgetown Law students to improve access to justice.

Created in our Georgetown Law class on Technology, Innovation and Law, the apps addressed a range of social and legal challenges with innovative new approaches. Students presented their work in a final pitch competition attended by experts from the access to justice community, with over 200 viewers casting votes online.

The winning app was the Whitman-Walker Health D.C. Name Change App, which helps D.C. adults who identify as transgender or gender expansive change their legal names. The app transforms a lengthy, confusing process into a convenient, time-saving experience by populating three forms: a Name Change Petition, a Fee Waiver Request, and a Motion to Seal the Petition.

The award for excellence in design went to the Legal Check Up app, designed for Georgetown’s Health Justice Alliance to identify potential legal concerns for cancer patients as part of a more comprehensive approach to healthcare.

The social media prize (for greatest popular vote) went to Navi-Gator, which helps people required to register after conviction determine how to comply with Illinois law. The app features a GPS-driven map that determines where such a person can live, work, and travel, and connects them to resources and a user community.

The class’s other exceptional projects were:

You can read more about the apps and watch the student presentations at www.irontechlawyer.com, and on Twitter following @GtwnIronTech and #IronTechLawyer.

Students, faculty and judges at the 2019 Iron Tech Lawyer competition


April 24, 2019 - Institute Fellow Gigi Sohn Testifies Before Maine Legislature on Broadband Privacy

Our Institute Fellow Gigi Sohn testified today before the Maine Legislature’s Joint Committee on Energy, Utilities and Technology in support of legislation that would require broadband internet access providers to protect their customers’ privacy.

Sohn, who has been a public interest advocate in technology policy for over 30 years, was a Counselor to former FCC Chairman Tom Wheeler when the FCC adopted the 2016 Broadband Privacy Rules, on which Maine’s proposed legislation is based. The federal broadband privacy rules were reversed by a controversial Congressional vote under the Congressional Review Act in April 2017.

Speaking in support of the Maine bill, L.D. 946, Sohn noted the unparalleled access broadband providers have to their customers’ private information:

“Broadband providers receive, store and use a vast amount of consumer information, including sensitive information. As the FCC found in 2016, a broadband provider “sits at a privileged place in the network, the bottleneck between the customer and the rest of the network….”  This gatekeeper position allows them to see every packet that a consumer sends and receives over the Internet while on the network, including its contents.

The FCC’s record showed that only three companies have third party tracking capabilities across more than 10 percent of the top one million websites, and none of those has access to more than approximately 25 percent of web pages. In contrast, a broadband provider sees 100 percent of a customer’s unencrypted Internet traffic.

Broadband providers also see all the encrypted traffic over their networks. Though they do not see the contents of these packets, they see when and how long a person is watching TV, visiting a website, turning on the lights, or using other devices. In addition, because broadband Internet access services are paid services, the broadband provider has the subscriber’s name, address, phone number and billing history. This gives them a uniquely detailed and comprehensive view of their customers.” 

L.D. 946 would prohibit a provider of broadband Internet access service from using, selling or sharing customers’ personal information without express consent, and would require providers to take reasonable measures to protect customers’ personal information.

Service providers’ practices have received increased attention since a January 2019 investigation by Vice Motherboard reported that AT&T, T-Mobile and Sprint sold customers’ geolocation data to data brokers, who then marketed it to bail bond companies and other third parties without customers’ knowledge or consent.

Referencing Congress’s repeal of the 2016 broadband privacy rules and recent actions by the FCC that have reduced the FCC’s authority over broadband internet access services, Sohn noted:

“When the federal government abdicates its responsibility to protect consumers, the states must step in. 

Broadband providers complain that if every state were to pass a similar law, they will be forced to comply with a “patchwork” of different consumer privacy protections, and that a federal framework would be preferable. I have little sympathy for an industry that was the driving force in convincing Congress to repeal the existing federal broadband privacy framework – the FCC’s 2016 rules - and then performed an encore by pushing the FCC to abdicate its oversight over broadband. . .

The solution to the alleged “patchwork” problem is for the companies to comply with the highest level of privacy protection a state requires.”

You can read Gigi’s full testimony here.

April 15, 2019 - Professor Neel Sukhatme Named Thomas Edison Visiting Scholar at U.S. Patent & Trademark Office

Georgetown Law professor and Tech Institute Faculty Advisor Neel Sukhatme was named the Thomas Alva Edison Visiting Scholar at the U.S. Patent and Trademark Office today.

The Edison Visiting Scholars are senior scholars and experts in law, economics, and related fields. They are invited to the Patent & Trademark Office to pursue their research on a wide range of topics related to IP and IP policy, and they advise policymakers on matters close to their areas of expertise. Georgetown Law professor Jay Thomas served as the inaugural Edison Visiting Scholar in 2012.

Professor Sukhatme is an Associate Professor at the Law School, where he teaches classes on patent law and empirical methods. He received his Ph.D. in Economics from Princeton University, where he was awarded the 2014 Towbes Prize for Outstanding Teaching, and his J.D. from Harvard Law School. Professor Sukhatme received his Bachelor’s Degree in Computer Engineering with a minor in Mathematics from the University of Illinois.

Professor Sukhatme’s research focuses on empirical patent law and law and economics. He teaches Property, Patent Law, Corporate Finance, and Empirical Analysis for Lawyers and Policymakers, and he co-directs the Georgetown Law and Economics Workshop series.

Some of his forthcoming and recent work includes:

  • Neel U. Sukhatme & Son Le, Reaching for Mediocrity: Competition and Stagnation in Pharmaceutical Innovation (working paper) [Gtown Law]

  • Neel U. Sukhatme & M. Gregg Bloche, Health Care Costs and the Arc of Innovation, 104 Minn. L. Rev. (forthcoming) [Gtown Law]

  • Neel U. Sukhatme & Ofer Eldar, Will Delaware Be Different? An Empirical Study of TC Heartland and the Shift to Defendant Choice of Venue, 104 Cornell L. Rev. 101-163 (2018).


  • Neel U. Sukhatme & Erik Hovenkamp, Vertical Mergers and the MFN Thicket in Television, Antitrust Chron., Aug. 2018, at 1-8. [WWW]

  • Neel U. Sukhatme, "Loser Pays" in Patent Examination, 54 Hous. L. Rev. 165-208 (2016).

  • Neel U. Sukhatme, Regulatory Monopoly and Differential Pricing in the Market for Patents, 71 Wash. & Lee L. Rev. 1855-1922 (2014).

Congratulations Neel!

April 9-10, 2019 - Georgetown Law Professors Testify at FTC Hearing on Consumer Privacy

Four Georgetown Law faculty will testify this week at the FTC’s hearings on approaches to consumer privacy. Associate Dean Paul Ohm, Professor David Vladeck, the Privacy Center’s Executive Director Laura Moy and Adjunct Professor Marc Groman join a mix of consumer advocates, academics, industry voices and other experts in a two-day hearing, part of the FTC’s series of hearings on Protecting Consumers and Competition in the 21st Century.

In the day’s opening panel, Associate Dean Paul Ohm joined Neil Chilson, former FTC Chief Technologist, and Alastair Mactaggart, chairman of Californians for Consumer Privacy, to discuss the goals of privacy protection. In his remarks, Ohm challenged the FTC to think about the harms it is positioned to address — not just those it has historically addressed. He predicted an increasing need for the agency and other enforcers to focus on “dark patterns,” tricks used to manipulate users into clicking buttons or selecting options they wouldn’t otherwise choose. Senators Mark Warner and Deb Fischer introduced bipartisan legislation on dark patterns today.

Ohm also referenced his prior calls for regulation that adjusts based on the scale of various companies, noting that companies with millions of customers should be held to higher standards than those with a small user base. Ohm wrote about that theory in the Georgetown Law Technology Review 2018 Symposium issue, available here.

Later in the day, the Privacy Center’s Laura Moy joined a panel focused on current approaches to privacy protection. The panel weighed the benefits and drawbacks of various privacy frameworks, with Moy noting that the traditional focus on individual harms fails to account for societal harms, including discriminatory advertising and the amplification of hate speech, misinformation and disinformation. Referencing current discussions about federal privacy legislation, she emphasized that a strong patchwork of state laws will better protect consumers than a weak federal standard.

The hearings continue tomorrow, with Georgetown professor David Vladeck, former head of the FTC’s Bureau of Consumer Protection, and adjunct professor Marc Groman, former head of privacy in the Office of Management and Budget, joining a full line-up. The livestream is available at https://competition-consumer-protection-hearings.videoshowcase.net/.

Good coverage of the hearings is available on Twitter at #ftchearings.

The FTC’s James Cooper, former FTC Chief Technologist Neil Chilson, Alastair Mactaggart and Georgetown’s Paul Ohm speak at the FTC’s hearing on consumer privacy

April 4, 2019 - Senator Ed Markey, FTC and FCC Commissioners Headline Event with Common Sense Media on Children in the Digital Age

This week, the Institute is partnering with Common Sense Media and Georgetown’s Children’s Digital Media Center on a conference about the present and future state of children’s digital well-being.

Among other speakers, the conference features Senator Ed Markey, FTC Commissioner Rohit Chopra, FCC Commissioner Jessica Rosenworcel, former Surgeon General Dr. Vivek Murthy, and Cameron Kasky, a survivor of the mass shooting at Marjory Stoneman Douglas High School and co-founder of March for Our Lives.

Georgetown Law’s David Vladeck and Angela Campbell will speak on panels about children’s privacy and competition policy, respectively. Mr. Kasky will discuss the power of technology to mobilize a movement, drawing on his experience organizing the 2018 March for Our Lives, a student-led demonstration in support of stronger gun protection measures that drew almost 900 partner demonstrations across the United States and around the world.

If you missed the event in person, a recording will soon be available here.

Truth About Tech: Solutions for Digital Well-Being

April 4, 2019

9am - 5pm

Georgetown University School of Continuing Studies, 640 Mass Ave NW

  • 9:15 a.m.: Welcome 

  • 9:30–10 a.m.: Opening Remarks

    • Hon. Ed Markey, U.S. senator of Massachusetts

  • 10–10:45 a.m.: Staying Connected: Tech and Social Relationships

    • Dr. Vivek Murthy, 19th surgeon general of the United States

    •  in conversation with James P. Steyer, CEO and founder, Common Sense

  • 10:45–11 a.m.: Break

  • 11–11:15 a.m.: Enforcing COPPA: Are We Protecting Kids' Privacy?

    • Rohit Chopra, FTC commissioner

  • 11:15–12 p.m.: You Are the Product: The High Cost of a Free Internet

    • Franklin Foer, The Atlantic

    • Nicol Turner-Lee, Brookings Institution

    • David Vladeck, Georgetown University Law Center

    • Cecilia Kang, New York Times (moderator)

  • 12–12:45 p.m.: Lunch

  • 12:45–1:10 p.m.: Holding Tech Accountable

    • Hon. Karl Racine, Attorney General of Washington, D.C.

  • 1:10–1:45 p.m.: Building Movements: Mobilizing the Power of Tech

    • Cameron Kasky, co-founder, March for Our Lives

    • in conversation with Elizabeth Galicia, Common Sense

  • 1:45–2:30 p.m.: Future Tech: Raising Kids in the AI Age

    • Jakki Bailey, University of Texas

    • Sandra Calvert, Georgetown University

    • Justine Cassell, Carnegie Mellon University

    • Michael Robb, Common Sense (moderator)

  • 2:30–2:45 p.m.: Break

  • 2:45–3:30 p.m.: Trust & Tech: Disrupting Monopolies

    • Roger McNamee, co-founder of Elevation Partners

    • Barry Lynn, Open Markets Institute 

    • Angela Campbell, Georgetown University Law Center (moderator)

  • 3:30–3:45 p.m.: Closing the Homework Gap

    • Jessica Rosenworcel, FCC commissioner

  • 3:45–4:30 p.m.: Digital Equity: Ensuring Access and the Making of Digital Citizens

    • Rachel Barr, Georgetown University

    • Lisa Guernsey, New America

    • Tina Plaza-Whoriskey, Child Trends

    • Amina Fazlullah, Common Sense (moderator)

  • 5:00 p.m.: Closing

Schedule subject to change.

March 25, 2019 - Institute Hosts Event on Algorithmic Bias & the Digital Divide

On Monday, the Tech Institute hosted a panel event on Algorithmic Exclusion: How Data Deserts in the U.S. Perpetuate Inequity. The event drew an important connection between the lack of connectivity for marginalized communities in rural and low-income areas and increasing concerns about bias in the algorithms that impact so many aspects of our lives.

Lack of connectivity hurts students trying to do their homework, jobseekers looking for work, and communities engaging in online discourse. But what about its impact on the fairness & equity of AI?

The digital divide dramatically exacerbates inequity in our society: nearly half of all people in the U.S. without home internet access are people of color. Six in 10 rural residents say high speed internet access is a problem in their area.

At a time when algorithms shape every facet of our lives—from how government resources are allocated, to the products and information you see online—disparities in online access risk perpetuating exclusion for communities of color, low-income communities and rural America.

Representative Brenda Lawrence (D-MI) gave remarks at the event, emphasizing that bridging the digital divide should be an issue of national priority. The panelists included a mix of consumer advocates, industry representatives and data scientists.

Thanks to our partners Public Knowledge and The Goodfriend Group for collaborating with us on this event. You can view the video and full details at www.georgetowntech.org/datadeserts.

March 13, 2019 - Institute Fellow Richard Whitt Collaborates on Transatlantic Content Moderation Working Group

Institute Fellow Richard Whitt has been busy on both sides of the Atlantic.  This past week, in Austin, Texas, he led a SXSW session entitled “Want to Fix the Web?  Equip the Users.” On March 4th, he led a workshop at Georgetown Law on his GLIAnet project to build a more trustworthy and accountable Web.

And the prior weekend, Whitt was at historic Ditchley Park, outside London, to participate in the inaugural session of the Transatlantic High-Level Working Group on Content Moderation and Freedom of Expression.  Organized and directed by former FCC Commissioner Susan Ness, Distinguished Fellow at the Annenberg Public Policy Center, the Working Group includes European and North American members with backgrounds in government, civil society, and industry.

Meeting from February 28th to March 3rd, the Working Group engaged in four days of intense dialogue centered on establishing best practices and recommendations to assist governments, tech companies and civil society in addressing illegal and harmful online content.  The key challenge, all participants acknowledged, is to protect the core principle of freedom of expression, even as governments and industries strive to reduce hate speech and disinformation.

The Working Group aims to ensure that policymakers in this space fully consider transatlantic perspectives, freedom of expression, and the due process of law.  Whitt contributed to the conversation as a strategic advisor, emphasizing the need to utilize systems thinking and design approaches that support functionally-robust analyses and solutions.

Key areas of agreement among the participants included:

  • Freedom of expression is a core value that needs protection by governments, the public, and industry.

  • Problems should be addressed in the most specific and concrete fashion, with an evidence-based approach to both problem definition and applied solutions.

  • Transparency by both governments and industry is an important enabler, allowing a more accurate assessment of the necessity and character of government responses.

  • Approaches should consider rights and obligations of all parties in the ecosystem — governments and users as well as platforms — and be sensitive to capacity differences among the players in all three groups. For example, over-regulation can have the unintended consequence of reducing opportunities for smaller players who have fewer resources to comply.

The Working Group will have a second meeting in May 2019, where it will focus on disinformation and on finding shared conclusions and recommendations. In the upcoming months it will present its discussions and preliminary findings, as well as receive feedback from stakeholders, at various round table meetings and other events in Europe and the US.

You can read more about the Transatlantic Working Group project here.

March 12, 2019 - Institute Fellow Gigi Sohn Testifies Before House Judiciary Committee on Sprint-TMobile Merger

Our Distinguished Fellow Gigi Sohn testified before the House Judiciary Committee on Tuesday in opposition to the proposed merger of T-Mobile and Sprint, which is currently undergoing regulatory review. In her remarks, Gigi raised significant concerns about the merger, including its impact on rural and low-income communities.

“T-Mobile and Sprint have promoted themselves as low-cost providers and currently offer the cheapest data plans of the 4 nationwide mobile wireless carriers,” she testified. “As such, T-Mobile and Sprint have competed vigorously with each other, to the benefit of the ‘value consumer’ seeking better rates and service plans. Just as important, the competition between Sprint and T-Mobile has had a moderating effect on AT&T and Verizon, forcing them to respond with lower prices and more attractive service options. All of this competition has benefitted consumers.”

“The merging parties don’t dispute that prices will go up, but argue instead without proof that the improvements to the quality of their service, no matter how minimal, will be worth the significant extra cost. That is a dicey proposition for the value and low-income consumers that are most attracted to T-Mobile and Sprint because of their less expensive postpaid and their innovative prepaid service.”

“These higher prices will have a disproportionate effect on customers of prepaid service, who tend to be low-income customers and people of color.”

Sohn’s testimony also countered the companies’ arguments that the merger is necessary for the U.S. to compete in the deployment of 5G, particularly in rural areas. She stated:

“T-Mobile’s owned LTE facilities currently serve 83.1% of the rural US population, while Sprint serves just 56.2%. So, adding Sprint to the New T-Mobile adds nothing to T-Mobile’s current rural coverage.

“Finally, and perhaps most important, the merging parties [also] understate the challenges and costs of bringing 5G connectivity to rural areas. In places where population density is low and the challenges of steep terrain and thick fauna are high, deployment is both a technological challenge and expensive and revenues are hard to come by. Moreover, the high speed “special access” lines needed to bring 5G connectivity to rural America are also expensive and largely in the control of 3 companies –AT&T, Verizon and Century Link. . . Policymakers should be extremely wary of any promise to bring 5G to significant parts of rural America in the absence of significant subsidies any time soon, if ever.”

You can read Sohn’s full written testimony here, and watch a recording of the hearing here.


March 6, 2019 - Fellow Deloris Wilson Speaks About Emerging Technology & Communities of Color at Dorothy Vaughan Symposium on Capitol Hill

Our Fellow Deloris Wilson today gave opening remarks on Capitol Hill at the Dorothy Vaughan Tech Symposium organized by Rep. Yvette Clarke (D-N.Y.).

The symposium brought together leading experts to talk about the impact of emerging technologies on black women and other marginalized communities - in particular, the threat of “deep fakes”, computer-generated graphics that can be used to falsify images, audio and video, and can easily be deployed as weapons of harassment.

Moderated by Mutale Nkonde of Data & Society, the symposium’s panel discussion featured Dr. Safiya Noble, author of Algorithms of Oppression; Dr. Joan Donovan, Director of the Technology & Social Change Project at the Harvard University Kennedy School of Government; Dr. Brandeis Marshall, former Department Chair and Associate Professor of Computer Science at Spelman College; and Dr. Mary Anne Franks, Legislative & Tech Policy Director of the Cyber Civil Rights Initiative.

The speakers highlighted examples ranging from the use of fake social media accounts to suppress the African American vote by targeting supporters of the Black Lives Matter movement, to the use of deep fakes in pornographic settings to devastate the lives of women, including women of color. Panelists noted that historically underrepresented groups can be targets for disinformation due to their perceived inability to “push back” against online harassment. They also noted how biases embedded in algorithmic training data, the underrepresentation of marginalized communities online, and the systemic privilege embedded in search algorithms and other online systems perpetuate bias and inequality. Panelists also discussed the need for stronger national legal protections to address the problem of deep fakes, protect women from harassment, and avoid a patchwork of state solutions and jurisdictional fights.

Rep. Clarke spoke at the event and stayed for its duration, engaging in meaningful Q&A with the audience after formal remarks.

You can read Deloris’s full remarks here.

March 5, 2019 - Institute's Congressional Briefing on Sprint-TMobile Merger Spotlights Tough Questions on Wireless Competition & Future of 5G

The Institute today hosted a packed briefing for Congressional staff, press and members of the public on the proposed $26 billion Sprint-TMobile merger, which is currently undergoing regulatory review.

The briefing brought together two former FCC Commissioners, Mignon Clyburn and Robert McDowell, representatives for the merging companies, and opponents of the deal for a rigorous conversation moderated by Institute Director Alexandra Givens.

Commissioner McDowell, now a partner at Cooley LLP advising the merging parties, faced off in a 25-minute debate with David Goodfriend, counsel to Dish Network and the Communications Workers of America, both of which oppose the deal.

A subsequent panel featured Commissioner Clyburn (now advising T-Mobile), Seth Bloom of Bloom Strategic Counsel (counsel to Sprint), Yosef Getachew of Common Cause and Ben Moncrief of C Spire, a regional wireless provider. The latter two organizations both oppose the deal.

The conversation covered the merger’s potential benefits for expanding 5G coverage in the United States, which Sprint and T-Mobile claim will advance exponentially if the merger goes ahead. Opponents challenged whether the merger would actually lead to those touted benefits, noting that Sprint’s mid-band spectrum assets would not advance T-Mobile’s strength in rural areas, where low-band spectrum is required.

Opponents also raised concerns about several different ways the merger would reduce competition—an argument Sprint and T-Mobile countered by saying the merger was needed for both entities to compete with behemoths Verizon and AT&T. Emphasizing the potential impact on low-income communities, Yosef Getachew of Common Cause noted the merger’s effects for pre-paid mobile services, as Sprint’s Boost Mobile and T-Mobile’s MetroPCS would come under single ownership. Ben Moncrief of C Spire likewise noted the potential impact on regional carriers and Mobile Virtual Network Operators, which rely on wholesale contracts with the four major nationwide wireless carriers to deliver service to their customers. Countering these points, Sprint and T-Mobile argued that the merger’s synergies would cause 5G coverage to expand so greatly that the New T-Mobile would have every incentive to enter into wholesale agreements and keep prices low.

The briefing came as the House Judiciary Committee announced a hearing on the deal, scheduled for March 12. The House Energy & Commerce Committee held a hearing late last month.

Our Distinguished Fellow Gigi Sohn will testify at the House Judiciary Committee hearing next week. Follow us on Twitter @GtownTechLaw for future updates.

(From L): Institute Director Alexandra Givens, Seth Bloom, Commissioner Mignon Clyburn, Ben Moncrief & Yosef Getachew during Tuesday’s panel

March 4, 2019 - Institute Hosts Workshop on GLIAnet: Building a Trustworthy Open Web

The Institute today hosted a workshop on an innovative proposal being developed by our Senior Fellow Richard Whitt to rebuild trust on the web. His proposal, GLIAnet, advocates for a new model of “trusted intermediaries” — entities that would act as digital agents between users and the web.

The GLIAnet model seeks to restore user agency, trust, and competition in the online ecosystem, as users voluntarily choose to share portions of their data with trustworthy entities, which then engage on the web on their behalf. These trusted intermediaries would employ a host of emerging technology tools—such as data lifestreams, personal AI avatars, localized cloudlets, and sovereign identity personas—to promote the interests of their clients.

Perhaps most interestingly, the model leverages a market-based solution, while contemplating an important backstop role for regulation and public policy.

Richard’s work has been covered in Fast Company’s “World Changing Ideas” column, and he has presented the GLIAnet project at MozFest and Silicon Flatirons. The workshop brought together policy experts, academics, potential industry partners and more to discuss the idea and potential pathways forward.

You can read Richard’s paper describing the GLIAnet vision here, and subscribe for updates at https://glia.net.


March 1, 2019 - CRISPR Gene Editing Takes Center Stage at Event Hosted by Tech Institute & O'Neill Institute for Public Health

We were thrilled to host a discussion today between two leading experts on CRISPR gene editing. Professor Debra Mathews of the Berman Institute of Bioethics at Johns Hopkins joined Professor Jake Sherkow of New York Law School for a rich discussion about the policy questions and legal landscape surrounding this potent new biotechnology.

Dr. Mathews’s academic work focuses on ethics and policy issues raised by emerging biotechnologies, with particular focus on genetics, stem cell science, neuroscience and synthetic biology. With colleagues, Dr. Mathews is the author of CRISPR: A path through the thicket, a paper in Nature that discusses the ethical questions of genome editing and presents recommended actions for continued research.

Jake Sherkow is a Professor of Law at the Innovation Center for Law and Technology at New York Law School, where he teaches a variety of courses related to intellectual property. His research focuses on how scientific developments, especially in the biosciences, affect patent law and litigation. Prof. Sherkow is the author of over 30 articles on these and related topics in both scientific journals and traditional law reviews, including Science, Nature, the Yale Law Journal, and the Stanford Law Review Online.

Both speakers generously agreed to let us share their slides from the event.

Dr. Mathews’s presentation, Human Genome Editing, Ethics & Policy, is available here.

Prof. Sherkow’s presentation, CRISPR, Patents and the Public Health, is available here.


March 1, 2019 - Tech Institute Welcomes Adjunct Prof. Mark MacCarthy as a Senior Fellow

The Tech Institute is thrilled to welcome Georgetown adjunct faculty member Mark MacCarthy as a Non-Resident Senior Fellow. Mark teaches courses in Georgetown’s Communications, Culture & Technology Program on tech policy, governance of emerging technologies and competition policy in tech industries, as well as courses on privacy and philosophy and the ethical challenges of AI in the Philosophy Department. He is also a Faculty Affiliate of the Georgetown Center for Business and Public Policy.

Mark was previously Senior Vice President of public policy for the Software & Information Industry Association (SIIA), where he directed initiatives and advised member companies on technology policy, privacy, AI and ethics, content moderation and competition policy in tech. 

His published research is available at SSRN, and his commentaries can be found at CIO Online and Forbes.

We’re glad to have Mark as part of the team. Welcome!

February 28, 2019 - Georgetown Launches New $55 million Center on Security & Emerging Technology

Big news for our tech policy efforts across Georgetown University: a $55 million grant from the Open Philanthropy Project is creating at Georgetown the largest center in the United States focused on Artificial Intelligence (AI) and policy.

The Center for Security and Emerging Technology (CSET), to be housed at Georgetown’s Walsh School of Foreign Service (SFS), will deliver nonpartisan analysis and advice to the U.S. and international policy and academic community on AI and other emerging technologies.

The Center will be run by Jason Matheny, former Director of IARPA, the agency responsible for developing breakthrough technologies for the U.S. intelligence community. He is a member of the National Security Commission on Artificial Intelligence, created in 2018, and co-chaired the Networking and Information Technology Research and Development Task Force on AI, which authored the National Artificial Intelligence Research and Development Strategic Plan released by the White House in 2016.

Technology & International Security

A key focus of CSET will be the ethical implications of the technological transformations in international security, drawing upon Georgetown’s considerable resources in this area.

Themes and questions CSET plans to address include:

  • National competitiveness, assessing how countries compare across different areas of AI development.

  • Talent and knowledge flows, including trade and immigration policies, and policies governing the exchange of information. 

  • Relationships with other technologies, assessing AI’s impact on existing weapons technologies and other systems.

Georgetown was chosen to host the center over a number of other top 25 universities, reflecting its pivotal role in the D.C. policy community and its strong commitment to engaging the ethical dimensions of contemporary policy challenges.

Technology & Society

CSET will become a key component of the university’s new Initiative in Technology and Society, bringing together research and teaching across Georgetown focused on creating more ethical and effective policies and uses for new technologies, with an eye to better societal outcomes.

The Law Center and the Institute are playing a central role guiding that growth, in partnership with the university’s Beeck Center for Social Impact + Innovation, Massive Data Institute, Ethics Lab, and other centers.

Read the official CSET announcement here, and great coverage in The Washington Post here.