August 13, 2018 | By Rachel Wilder, Program Assistant 

Artificial intelligence for social good isn’t just hype. AI allows computer systems to perform tasks, like visual perception and decision making, that previously required human intelligence. Public and nonprofit sector leaders have an opportunity to increase their impact by applying AI to resource optimization and prediction problems outside the bounds of older methods: researchers and local government partners have used AI to better identify police officers at risk of adverse events like racial profiling or deadly use of force and to improve HIV awareness and testing rates among homeless youth. AI holds promise as a tool to approach a variety of social problems.

But in order for these benefits to be realized at scale, we need to overcome significant data infrastructure barriers. My brother, Bryan Wilder, is completing his PhD at the University of Southern California’s Center for AI in Society (CAIS). I found compelling overlap between the data needs that he and his advisor see and the Beeck Center’s work on data for social good. Three that stood out to me:

1. We need more high-quality data.

Despite the rapid expansion of public and private sector data collection, there often just isn’t enough data on the issues and people that AI for good can benefit most. For example, CAIS identifies many issues in public health (including outreach, disease tracking, and treatment decisions) that AI is well-positioned to address. However, public health data is especially scarce in the low-resource and developing country contexts where disease prevention could have the biggest impact. And as Gideon Rosenblatt and Abhishek Gupta recently commented in the Stanford Social Innovation Review, it isn’t enough that data is collected; datasets must be complete, accurate, and structured in order for machine learning systems to be developed.

2. We need to streamline data sharing across sectors.

Computer science researchers in academia have the energy and resources to apply AI to social problems, but they need access to data in order to do so. Even within interested social impact partner organizations, in-house data use restrictions can make the process of sharing with researchers prohibitively difficult.

A Beeck Center report published last year, “Accelerating the Sharing of Data Across Sectors to Advance the Common Good,” outlines a framework for governments and private companies to share data through a trusted intermediary with sensitivity to privacy and ethics concerns. This idea is echoed in discussion on the development of a “Data Commons” that would serve as a unified platform for data to be used in AI work. We should continue to push the conversation on getting data out of organizational silos and into the hands of those who can use it for good.

3. We need social sector leaders with data and technology literacy.

In order for governments and nonprofits to know that AI-driven solutions meet ethical considerations – including ensuring that racial and gender biases don’t influence results – there must be organizational leaders who can understand how algorithms arrived at recommendations or predictions.

Algorithmic bias is a serious and well-documented problem with AI, and it is especially important to uncover bias when working on social issues that affect groups already struggling with systemic bias. CAIS director Milind Tambe acknowledged that “[b]eing able to explain decisions an AI system has made to an end user is very important,” noting that “[i]n many cases, we are working with vulnerable communities and populations, and we need to ensure they will not be harmed.”

The Beeck Center and Deloitte’s Center for Government Insights have co-produced a playbook for Chief Data Officers in government that explicitly addresses this subject, giving data leaders a roadmap for understanding and managing algorithmic risks. Beeck Center researchers have also published a framework for ethical blockchain design that can serve as a template for ethical design in other technologies, including AI. This type of training for data officials in the public sector and implementers of new tech solutions will become increasingly important as AI becomes more common.  

The practice of addressing social good questions with artificial intelligence is young, and it’s exciting to me to envision how AI tools could amplify the impact of public programs if they are successfully applied at scale. Enabling that future will require investing in data collection, sharing, and literacy. My colleagues at the Beeck Center are working at the heart of advocacy and education efforts to make those investments a reality – so stay tuned!

“I have nothing to hide” is a tired justification that we can no longer use when it comes to our data privacy. I think we will find too late that the importance of privacy has nothing to do with compromising information on an individual level and everything to do with the information and power we have collectively given away.

June 4, 2018 | By Lara Fishbane, Research Assistant

On April 30, Anna Lauren Hoffmann published an article on Medium that outlines example after example of how the data collected on us is inconspicuously damaging any efforts we’re making toward a better, more equal society. Even as we tell women that they can be engineers, architects, lawyers, entrepreneurs, and presidents, Google Translate is subtly suggesting otherwise. As we march the streets demanding that it be understood that Black Lives Matter, Facebook is systematically failing to identify black men and women as people.

The dangers Hoffmann points to are frightening and real. And the need for action in an increasingly hyperconnected and technology-enabled world is urgent. How do we build a world where our algorithms are attentive to social consequences? And, perhaps even more important, how do we reach a world where data and technology solve for social inequality?

We can say again and again that what we need is more diversity in tech—and it’s true, we definitely do—but the problems in our algorithms will persist. Even a perfectly diverse group of engineers would inevitably be constrained by algorithms that learn from a world of biased outcomes. In other words, the problem remains that algorithms rely on data collected in a society of systemic oppression, bias, and inequity. Even a “neutral” algorithm cannot escape that fact.

Perhaps then the call is to eliminate categories such as race, gender, sexual preference, etc., from any automated decision-making processes. However, even without explicit groupings in our data, we still run the danger of perpetuating biases. For example, imagine a hiring algorithm employed by a company looking to fill a vacancy. Even if the algorithm is blind to names, addresses, race, and gender, it’s possible that the algorithm picks up details that correlate with these categories. A person may have attended a high school in a predominantly non-white neighborhood, may use adjectives or syntax native to certain cultural backgrounds, or may have participated in affinity groups that correlate with social groups. Though these algorithms don’t consider race or gender, they will ultimately reproduce the same biases that already corrupt our hiring processes, while operating under the guise of neutrality.
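This proxy effect can be sketched with a toy simulation. Everything below is fabricated for illustration: the groups, the “school” feature, and the screening threshold are all hypothetical. The point is only that a rule which never sees the protected attribute can still produce sharply different selection rates across groups, because the feature it relies on correlates with group membership rather than with ability.

```python
# Toy illustration of proxy discrimination: a "blind" screening rule
# reproduces group disparities through a correlated feature.
# All data and thresholds are fabricated for demonstration purposes.
import random

random.seed(0)

applicants = []
for _ in range(1000):
    group = random.choice(["A", "B"])  # protected attribute (never shown to the rule)
    # Proxy feature: neighborhood school rating correlates with group...
    school = random.gauss(0.7 if group == "A" else 0.4, 0.1)
    # ...while true job-relevant ability is distributed identically for both groups.
    ability = random.gauss(0.5, 0.1)
    applicants.append({"group": group, "school": school, "ability": ability})

def screen(applicant):
    """Screening rule that uses only the proxy, never the group label."""
    return applicant["school"] > 0.55

def selection_rate(group):
    pool = [a for a in applicants if a["group"] == group]
    return sum(screen(a) for a in pool) / len(pool)

print(f"Group A selection rate: {selection_rate('A'):.2f}")
print(f"Group B selection rate: {selection_rate('B'):.2f}")
```

Despite identical ability distributions and a rule that is formally blind to group membership, the two groups see very different selection rates — which is exactly the dynamic the paragraph above describes.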

And so, regardless of how little you think you have to hide, the collection of your data is dangerous. It allows companies to form the types of correlations that actively threaten social progress. Even if the data isn’t being used to make discriminatory hiring, loan, or credit decisions, the potential for harm is no less real. Take Facebook, for example, which uses your data to improve your ad experience. Though masked by the pretense of a better experience, the consequences of something so seemingly benign are worth considering. Are we okay with a society in which men are more often shown advertisements for high-paying jobs than women? Or one in which Google searches for black-sounding names are associated with criminality? What about one in which low-income consumers are inundated with gambling-related advertisements? In aggregate, it’s impossible to say that these decisions about how our data is collected, stored, sold, and used don’t matter.

Europe’s General Data Protection Regulation attempts to solve for some of these concerns around privacy and automated decisions. For example, Articles 13, 15, and 22 grant users the right to an explanation of how their personal data is being used to arrive at decisions. Recital 71 grants them the power to challenge that decision. The development of these articles is not insignificant and represents part of a larger conversation around taking “back” (did we ever have it?) control of our data and the usage of it. But it is likely to be limited — source code is too esoteric for lay people to understand, and a lay explanation might miss the complexity of how the algorithm is actually functioning. Further, even if the outputs are understood to be unfair, it seems unduly burdensome to shift the responsibility of challenging the decision to the end user. Marginalized and oppressed groups already often bear the brunt of needing to redress the injustices enacted upon them.

What’s really needed—and I am certainly not the first to suggest this—is a code of ethics around data and how it gets used in algorithms. Such a code should be underpinned by values of equity and fairness, and reflect the world we want to live in. Moreover, perhaps counterintuitively, it should be less well-defined rather than more: vagueness pushes companies to strive for better, whereas hard lines become targets to be reached and never exceeded. Further, there should be trusted third parties whose job it is to vet these algorithms and represent the rights of the end user.

This premise is not without its own challenges: namely, the difficulty of developing a code of ethics that adequately represents the rights of people rather than those with vested interests; the creation of a new and fair marketplace for vetting and authenticating code; and protection against the perverse and dangerous incentives that may develop when third parties are paid to be the arbiters of fairness.

However, these are not sufficient reasons for inaction. Imperfect solutions that strive for something better through thoughtful and collaborative design are better than just letting our systems continue as is. It is our responsibility to not leave this one unresolved.

The data collected, released, and produced by the government has the potential to be leveraged for social good, but concerns about privacy and citizens’ rights are paramount.

May 1, 2018 | By Hollie Russon Gilman, Senior Fellow & Ali Shahbaz, Student Analyst

As the open data movement continues to evolve, the role of Chief Data Officers and institutional design matters for the implementation of data-driven governance and decision making. However, it is not enough to think about the supply side of public sector data. We also need to think about the demand side. There are a few core components of this, which include engaging with civil society, training the next generation of public servants, and working to secure individual citizens’ data privacy and rights.

Data is an asset for civil society and philanthropy, which can play an intermediary role, something that Lucy Bernholz calls “digital civil society.” Established initiatives such as the Mastercard Center for Inclusive Growth are already building the infrastructure for data philanthropy. However, especially in smaller organizations, there can be capacity gaps in ensuring that data is effectively used and deployed. Civil society and philanthropy can also play a role to ensure that there is a conversation surrounding the normative value and ethics behind which data is released and for what social purpose. This requires civil society working together to collectively build tools and resources that address data security, stewardship, and access — as Josh Levy & Katie Gillum recently wrote in the Stanford Social Innovation Review. Because so much seemingly private information can now be easily accessed, it is essential for social justice organizations to collaborate in order to ensure that one organization is not inadvertently jeopardizing other missions.

Second, in order for the public sector to effectively leverage data there needs to be training and a recognition within institutional structures that data is a catalyst for internal decision making as well as a public asset for people, business, and society. Building an architecture of innovation, which we have written about at the Beeck Center, helps create a structure to ensure better institutional design between the core pillars of governance. There are serious legal and cultural challenges to effectively sharing data across agencies and different levels of government (e.g. state, local, and federal). In addition to modernizing software we also need to equip public servants with a range of skill sets, including upskilling current public servants with data and tech literacy training. San Francisco and Kansas City, in addition to the Department of Commerce, have already launched their own “Data Academy Programs.” All of these initiatives also require high-level political leadership and airtime.

Finally, how can we make data an asset for citizens? There is an enormous amount of value in the data that citizens hold and generate (both individually and through social networks). The European Union’s General Data Protection Regulation (GDPR), scheduled to take effect in May 2018, demonstrates a path toward reliable online privacy balanced with transparency. The GDPR is the first legal bill of rights for personal data. One of the most exciting aspects of the GDPR is the concept of “data portability,” which empowers consumers to have a clear record of their personal data so that they can choose if and how they want their data to appear. GDPR also offers the “right to be forgotten” — if someone wants their data removed from an app or company, now it is a possibility. There is no doubt that regulatory instruments like the GDPR will be a milestone in standardizing best practices for data transparency, ownership, consent, and sharing. However, there will be interesting questions about whether and how the U.S. responds and what the role of other institutions will be in complying.

The data that people hold will continue to be extremely valuable. There may be opportunities for leveraging individual data for the public good, such as in the Human Genome Project. However, its implementation requires an individual and institutional understanding of data usage, protection, sharing, and integration. Throughout all these conversations it is essential that questions of equity, digital access, and digital literacy are placed front and center. Without the regulations to retain and protect people’s data, we run the risk of growing digital inequality in our already deeply unequal society.

Opening government data has the potential to build trust between citizens and the state while pushing for better public outcomes.

April 19, 2018 | By Madison Suh, Student Analyst

On April 9th, the Beeck Center, in partnership with the MacArthur Research Network on Opening Governance, convened over 60 government and industry leaders at Georgetown University to discuss the relationship between data, trust, and governance. The center has been focused on the opportunity for leveraging data for social good, and was pleased to convene this event as the final in a three-part series on the future of open data.

Many governments have committed to open data policies and practices, yet there is a need for a more nuanced discussion on data governance. With this goal in mind, the center invited practitioners and leaders to join a dinner and policy discussion on how best to govern data and build trust.

One of the first questions guiding the April 9th conversation was whether the opening of government is an appropriate response to mitigate diminishing levels of trust between the citizen and the state. The panel launched into the discussion with a brief history on the evolution of open data. The open data movement began with the massive release of data into the public domain, where data had previously been largely unstructured and unmined. The first wave was the purposeful release and utilization of data in ways that benefitted citizens and built critical infrastructure. This development was followed by the mobilization of citizen feedback in response to governance structures and services. Finally, government has focused on how to respond to citizen feedback in order to deliver better outcomes; it is this domain that offers an avenue to build trust.

With trust in governance institutions at historic lows, governments need to prioritize closing the feedback loop between citizens and the state. As Christopher Wilson, Visiting Fellow at the Beeck Center, said, “there has to be a certain amount of trust that data is being interpreted in good faith and used for their intended purpose.” Panelists Sanjay Pradhan, CEO of the Open Government Partnership; Beth Noveck, Professor at the NYU Tandon School of Engineering; and William Eggers, Executive Director of Deloitte’s Center for Government Insights, urged the audience to have hope in the power of data to restore citizens’ trust in government and offered examples, including: data prediction capabilities that impact policy decisions and resource allocation; the creation of data commons that prioritize transparency; and a government-sponsored platform for blended public and private data. Data pipelines could reduce friction between the private and public sectors, while customer experience principles could be applied to curate and visualize data in accessible ways, and untapped data sets could be used to further the development of products, services, and research.

Various themes emerged from the event, offering key insights and suggested approaches on data governance:

Shift Culture to Breed Trust

The panelists suggested that responsive and responsible use of data will drive an incremental culture shift in how data is governed and used to build trust. Data governance is essential at every step of the data life cycle. The transparency of processes — an honest assessment of both opportunities and challenges — is critical to foster institutional readiness and responsible stewardship of data. To build trust, there must be a cultural shift towards innovation and public entrepreneurship, with co-creation between the private and public sectors. However, panelists noted that shifting culture is “hand-to-hand combat,” particularly “where powerful elites benefit.” Building a climate of trust will require coordination between government and civil society and a balance of the risks and benefits of how data is collected, analyzed, and used.

Prepare for Future Obstacles

It was made clear from the discussion that the greatest promises for the future of data governance, if left unmoderated, could also pose some of the greatest risks and challenges. For example, artificial intelligence and machine learning, if left unchecked, could prompt ethical and normative challenges for the future of democracy, including privacy and cyber risks. Both individual risks (privacy, security, and personal safety) and organizational risks (confidentiality, liability, and intellectual property) are active concerns.

Engage Citizens As Participants

Above all, the panelists said, governance models should be citizen-centric, so that citizens are heard and responded to. The panelists stressed the importance of citizen participation and state response to foster and emphasize a trusting relationship. For example, participatory budgeting gives citizens direct decision-making power to influence policy outcomes within government. This is a tool to build trust in government, as it has done in the South Kivu province of the Democratic Republic of the Congo, where mobile phones and town halls were used to create a line-item vote and increase citizen participation.

Rebuild Trust Collectively

According to one of the event’s participants, the open data movement requires a broader range of communication and analytical skills as well as the ability to assess the risks, benefits, and limitations of data usage, access, protection, and sharing. Multi-sector stakeholders, including industry, academia, and NGOs, need to be part of this process to uphold standards.

One of the outputs of our three-dinner series with a group of multi-sector data leaders from across government, academia, civil society, and the private sector is a forthcoming Chief Data Officer playbook co-published by the Beeck Center and Deloitte’s Center for Government Insights. The publication focuses on a range of key aspects of the open data discourse, including the history and evolution of the role of Chief Data Officers in the government; the use of data as an asset for public policy; the translation of data into storytelling tools; and the evaluation of data ownership, sharing, privacy, and stewardship.

Ensuring the future of open data will require all actors to share a greater sense of accountability. In addition to the foundational principle of doing no harm, there is an inherent responsibility to do good.

For more information, visit www.beeckcenter.georgetown.edu or email us at beeckcenter@georgetown.edu.

With blockchain technology in its formative stage, developers and practitioners have an opportunity to set ground rules that will protect people and ensure an ethical approach to applications for social good.

March 15, 2018 | By Lara Fishbane, Research Assistant

On March 8th, the Beeck Center hosted a dinner for industry practitioners and policymakers to preview our forthcoming Blockchain Ethical Design Framework and to discuss actionable strategies for the responsible development and implementation of blockchain solutions for social impact. In an effort to move discussion on blockchain’s value beyond the current media cycles of hype and despair, we convened 50 leaders for a conversation about the technology’s ethical implications and how to advance approaches that link the design of blockchain to human outcomes.

Sonal Shah, executive director of the Beeck Center, opened the night with a reminder that this conversation is part of an important, broader conversation that society should be having now about data, technology, and ethics. As technology has become increasingly cheap, capable, and ubiquitous, its potential to enable solutions that benefit marginalized and underserved communities has also increased. Globally, organizations are calling for technology-based solutions that improve people’s lives. At the same time, we’re realizing that technology is not neutral. “Values are always embedded in technology,” Shah said. Blockchain is no exception.

Beeck Center Senior Fellow Cara LaPointe, who leads our Blockchain for Social Good effort, pointed out, “the technology is developing at a pace much faster than our ability to create governance around that technology.” As evidence, hundreds of blockchain for social good pilot projects are already in the works. Blockchain for Change, a startup in New York, is exploring blockchain’s potential for distributing services to the city’s homeless population, and a number of global organizations are leveraging the technology to help refugees access financial services. How solutions like these are designed and implemented will have real ethical consequences for people. With the technology in its formative stage, developers and practitioners have an opportunity to set the ground rules that will protect people before any ill effects are cemented into standards.

LaPointe highlighted blockchain’s key characteristics: transparency, trust, and immutability, noting that these are not just characteristics; they are also values. The system is set up with an underlying idea about how the technology, and, by extension, the world, should be: transparent, decentralized, secure, auditable. And this world offers new promise in terms of creating and delivering untapped social value. Panelist David Treat, Managing Director at Accenture, spoke to blockchain’s potential for creating new and decentralized identity systems as well as bringing value directly to small businesses and farmers. Katherine Foster, a blockchain specialist at the World Bank, added that the technology could more broadly be leveraged to meet development goals, through efforts such as the distribution of food aid.

However, parallel to blockchain’s unprecedented opportunities lie ethical challenges. Each value is coupled with risk. Immutability offers security, but it also means that potentially erroneous information is made permanent. Transparency offers access, but it also leaves open the chance of exploitation for vulnerable populations. Rules-based trust offers decentralized collaboration between many parties, but as CARE’s Chief Digital Officer, Macon Phillips, pointed out, certain protocols to achieve consensus among these parties require large amounts of energy that are harmful to the environment. When asked what makes ethics so difficult to achieve, LaPointe explained, “it’s complicated because everything is interconnected.”

LaPointe emphasized that each choice, no matter how small, has huge ethical implications. People are using the technology for the good it can provide, but they should think about the ethical responsibilities that come with it. Panelist Natalie Evans Harris, Chief Operating Officer at BrightHive, underscored the need for an ethical approach, reminding us that blockchain is about data and “data is people.” Real people’s lives will be affected by any solution that blockchain enables. To ensure that blockchain solutions deliver social good for all, developers should be intentional about building inclusion and equity into the design.

Panelist Rahul Chandran, Executive Director of the Global Alliance for Humanitarian Innovation, drove home the message, “I am unconvinced that blockchain is going to save humanity from itself. Without an ethical code, nothing will get to scale, we won’t get the partnerships together, and we also won’t try anything because we don’t get to experiment on people without ethical standards.”

The Beeck Center’s Ethical Design Framework is an actionable tool for the continued development of the technology for social good. While blockchain comes with inherent risks, that is not a reason to dismiss its potential social value. As Harris pointed out, “At some point, technology moves forward because it’s a benefit to individuals and society.” LaPointe added that if a technology can help people, it becomes an ethical obligation to use it. It’s important that it also be designed ethically and intentionally.

Proposed Guiding Principles for Opportunity Zones to Fuel an Inclusive Economy and Drive Social Impact

March 13, 2018 | By Lisa Hall, Senior Fellow

What if economic tax incentives designed to improve the place you call home don’t consider your needs? What if tax benefits instead focus on high-end projects that don’t require a federal tax subsidy to be successful, creating a new economic reality that feels far from the home you know? Opportunity Zones are a brand-new mechanism established by Congress, designed to drive private capital into distressed areas in the United States through deferred taxes on capital gains. How can these Zones, and the Opportunity Funds which will invest in them, be carefully constructed with the people living in underserved communities at the heart of decisions?

Place-based strategies are commonly employed by community development practitioners and policymakers to achieve social impact. Opportunity Zones have the potential to enhance and bolster existing place-based strategies that currently benefit low-income communities, including Promise Zones, New Markets Tax Credits, and Choice Neighborhoods. Opportunity Zones also have the potential to do harm, as Adam Looney contends in his recent Brookings post. And Opportunity Zones, as emphasized in the recent article by Rachel M. Cohen in the Intercept, can sometimes have unintended consequences.

Consistent with our belief that economic policies should be implemented in a way that considers and serves the people in the communities affected, the Beeck Center for Social Impact at Georgetown University, in partnership with the Kresge Foundation, convened an expert group of community development practitioners to explore how Opportunity Zones can drive capital to communities in a way that truly benefits the individuals and families that currently live and work there. We asked ourselves a simple question: How can Opportunity Zones be used as a tool for community development and not solely a tool for financial gain?

In response to this question, we drafted proposed guiding principles for the designation of Opportunity Zones. The principles are intended to serve as a starting place to help guide the designation process and, ultimately, the creation of Opportunity Funds that can best serve the people currently living and working in these areas, which, by definition in the statute, must be low-income census tracts. The following principles are presented as a straw-person for discussion. These principles are not meant to be prescriptive, but rather to spark conversation and embrace the opportunity for social impact. We invite feedback by email to me at lisa.hall@georgetown.edu. Comments will be collected and shared with the working group.


Proposed Guiding Principles for Opportunity Zones to Fuel an Inclusive Economy and Drive Social Impact

1. Recognizing that Opportunity Zones will deliver publicly funded tax incentives and subsidy to communities across the US, the state selection process should include, as a key objective, the goal of delivering public benefit to a range of stakeholders: not solely private investors, but also current residents of low-income communities, community development organizations, community service organizations, and social enterprises.

2. Where possible, Opportunity Zones should be selected in combination with state tax incentives and allocations by states for other government programs that directly benefit low-income households and communities, such as the Low-Income Housing Tax Credit and New Markets Tax Credits. Benefits generated in Opportunity Zones should be additive to existing efforts and not cannibalize existing or prospective community development investments, like those motivated by the Community Reinvestment Act.

3. Impact objectives for Opportunity Zones should be established and tracked, including but not limited to goals for raising the standard of living for current residents. Examples include output goals like the number of new businesses created, living-wage jobs created, and affordable housing units built. Outcome goals, like increased median household income and improvements in health statistics, should also be considered.

4. States should adopt methodologies for selecting Opportunity Zones that are consistent with effective evaluation standards and best practices for research design to facilitate ongoing monitoring of zones, leveraging evaluation resources available from academic institutions.

5. The selection process for Opportunity Zones should consider the capacity of neighborhoods to absorb private capital and the existing infrastructure needed to enable investments in businesses as well as real estate. States should seek to integrate investments generated by the tax benefit to complement and leverage existing and prospective economic activities in designated Opportunity Zones.

6. Opportunity Zones should be selected with consideration given to environmental issues. States should encourage or mandate that businesses located in Opportunity Zones adhere to environmental best practices.

7. Efforts should be made to ensure that current residents of Opportunity Zones are able to remain in their neighborhoods or can benefit from rising property values. Examples include state and local tax abatements for low-income homeowners.

8. A balance of rural and urban neighborhoods should be selected to diversify investment activity and to ensure that rural areas are eligible for investment. Opportunity Zones should be selected in a geographically targeted manner so that there can be a sufficient investment of resources in each Opportunity Zone.

9. States should identify and support community development intermediaries, like community development financial institutions (CDFIs) and community banks, that can provide debt financing to support businesses and real estate that will benefit from equity investments from Opportunity Funds.

10. In addition to prohibited business activities like gambling and liquor stores, states should discourage the creation of new businesses, like payday lenders, that disadvantage low-income communities in Opportunity Zones.


Speed is of the essence in putting these principles into practice. Several groups, including the Economic Innovation Group and the US Impact Investing Alliance, advocated for and helped craft what was originally known as the Investing in Opportunity Act. Many in the community development and impact investing fields have embraced the concept of Opportunity Zones and Opportunity Funds, which was incorporated with bipartisan support into the Tax Cuts and Jobs Act passed at the end of 2017. State governments and territories have also embraced the new legislation and are already selecting Opportunity Zones to comply with the requirement that Governors designate low-income census tracts by a March 21, 2018 deadline. Some states have hit the ground running, launching websites to solicit input and comments on the designation process. Local and national nonprofit organizations, including Enterprise Community Partners, the Council of Development Finance Agencies, and LISC, are supporting these efforts by raising awareness about the program, providing resources and analysis of the legislation, and engaging community development organizations in the state-by-state designation processes.

We believe this new tax benefit creates an opportunity to improve low-income communities in underserved rural and urban areas by attracting more private capital to finance small businesses, community services, and social enterprises. But if Opportunity Zones and Opportunity Funds are designed in ways that solely benefit activities and projects that do not need subsidy to succeed, including high-end, real-estate-based projects, then the legislation will not meet its potential for delivering meaningful impact. Opportunity Zones can and should create living-wage jobs, improve community assets, and help build wealth for people in places that have not yet recovered from the global recession.

Check out Lisa Hall’s interview on KALW Local Public Radio!


Lisa Hall is a Senior Fellow at the Beeck Center for Social Impact + Innovation at Georgetown University, which engages global leaders to drive social change at scale.  She has dedicated her 25-year career to economic and social justice, impact investing and community development.  Lisa has served in executive roles across multiple sectors in the United States and abroad, including time as CEO at Calvert Impact Capital and Managing Director at Anthos Asset Management. Her area of focus at the Beeck Center is the inclusive economy, exploring how social innovation and access to opportunity can drive prosperity for all communities. She is active on Twitter @lisagreenhall

March 8, 2018 | By Cara LaPointe

Download the Blockchain Ethical Design Framework (PDF)


Executive Summary

Blockchain technology can create scalable social impact and has the ability to change people’s lives. Emerging applications are demonstrating blockchain’s social value – from smart contracts that hold both parties to their agreements to transparent land registries to digital identities for refugees. The social effects of blockchain can be powerful and lasting. Hence, making intentional, ethical decisions in its design and implementation is critical to ensure the technology’s potential for transformative change.

Blockchain is a digital distributed ledger technology that can provide secure and immutable records of distributed and sequenced information or transactions. Blockchain does not require a central trust authority to verify information or authenticate transactions; rather, rules pre-written into code define how actors can behave in the system. It is this unique combination of attributes – transparency, trust, and immutability of transactions – that makes blockchain technology appealing. Depending on how it is designed, blockchain can also produce a wide range of actual consequences for people. The technology’s flexibility and extensibility, along with its immutability, transparency, and rules-based trust, demand a thoughtful, shared approach to its design and use.
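The immutability described above comes from cryptographically chaining each record to the one before it. The following is a minimal, illustrative sketch of that idea in Python, not how any production blockchain is implemented (the block structure and function names here are invented for illustration):

```python
import hashlib
import json

def block_hash(contents):
    """Hash a block's contents, which include the previous block's hash."""
    payload = json.dumps(contents, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, data):
    """Append a new block whose hash commits to all prior history."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    contents = {"data": data, "prev_hash": prev}
    block = dict(contents, hash=block_hash(contents))
    chain.append(block)
    return chain

def verify(chain):
    """Return True only if every block matches its recorded hash and
    links to its predecessor -- any edit to history breaks the chain."""
    prev = "0" * 64
    for block in chain:
        contents = {"data": block["data"], "prev_hash": block["prev_hash"]}
        if block["prev_hash"] != prev or block_hash(contents) != block["hash"]:
            return False
        prev = block["hash"]
    return True

chain = []
append_block(chain, {"from": "A", "to": "B", "amount": 5})
append_block(chain, {"from": "B", "to": "C", "amount": 2})
print(verify(chain))               # True: the record is intact
chain[0]["data"]["amount"] = 500   # attempt to rewrite history
print(verify(chain))               # False: tampering is immediately detectable
```

Because each block's hash depends on its predecessor's hash, altering any past record invalidates every block after it, which is what makes retroactive tampering detectable without a central authority.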

The Beeck Center for Social Innovation + Impact at Georgetown University, as a learning partner with The Rockefeller Foundation, has developed the forthcoming Blockchain Ethical Design Framework as a tool for practitioners to drive ethical intentionality into the design of blockchain technology for social impact. We are interested in blockchain because, as a technology in its formative stage, it offers an opportunity to rethink how society can best leverage data and technology for social impact.

Ethical Design and Implementation

There are important ethical considerations in blockchain’s design for human lives, especially for vulnerable and marginalized populations. Intentionality of design is critical, both as solutions are scaled and as standards are established. Technology is never neutral; it affects people in both helpful and harmful ways. Values are always embedded in the technology, even when they are not overtly recognizable.

This is especially true of blockchain. For example, a digital identity system can provide an immutable and secure identity that is uniquely linked to a person’s biometrics, such as their fingerprints and iris scans, which could allow refugees who have lost everything to cross a border or access vital aid and medical services. However, how private, personal information is recorded on a blockchain, and who has access to it, could also expose refugees to exploitation now or in the future.

How the system is coded, who has access to it, and which rules govern it have intentional and unintentional consequences. Understanding the ethical impacts of each of these decisions matters. To ensure the best outcomes for individuals and communities, blockchains should be intentionally designed with people in mind and guided by an ethical approach. The framework walks through a conventional design process that has been expanded to focus explicitly on how to apply an intentional approach:

  • Define the problem being addressed and the desired outcomes
  • Explicitly identify the ethical approach
  • Assess the ecosystem of the desired outcome
  • Determine the guiding design philosophy
  • Determine if blockchain is an appropriate technology choice

Once blockchain is selected as an appropriate technology, the framework then moves iteratively through a detailed analysis of six root issues: governance, identity, verification/authentication, access, ownership of data, and security. At each stage, guiding questions serve to identify the effects of the design choices on the end users and communities.

  • How is governance created and maintained?
  • How is identity established?
  • How are inputs verified and transactions authenticated?
  • How is access defined, granted, and executed?
  • How is ownership of data defined, granted, and executed?
  • How is security set up and ensured?

Moving Ethics into Action

The promise of blockchain is real. Its key attributes of transparency, trust, and immutability can deliver real impact by increasing efficiency, security, and verifiability in the way that organizations operate, access to services is delivered, data is stored and controlled, and assets are tracked. However, realizing this potential requires an ethical approach that recognizes the relationship between design and human outcomes.

As blockchain solutions are built and deployed, the Blockchain Ethical Design Framework provides a way to ensure that social value is protected. The diverse group of experts convened to inform this work needs to continue to be at the forefront of efforts to bring ethics into action. As such, the Beeck Center is working with standards organizations and practitioners to integrate this framework within broader initiatives addressing digital inclusion and the ethical implementation of data and technology. From practitioners to policymakers, we all share the responsibility to continue the conversation and demand intentional ethical approaches in the design of data and technology for social good.

Download the Blockchain Ethical Design Framework (PDF).

For more information, visit www.beeckcenter.georgetown.edu, or email us at beeckcenter@georgetown.edu

January 22, 2018 | By Lara Fishbane

The more momentum blockchain has gained in the social good space, the more eager pundits have been to question the very ground it stands on. Blockchain has, time and time again, been labeled a technology solution looking for a problem to solve.

What I’ve always found interesting about these claims, though, is that blockchain was developed to solve a particular problem: trust. Satoshi Nakamoto’s original whitepaper outlines the dangers of over- and misplaced trust in financial institutions before introducing the electronic payment system that eliminates the need for it. The idea is that if people are able to transact directly, they do not need to rely on potentially corrupt and costly intermediaries.

After all, intermediaries are themselves an imperfect solution to the trust problem. The fundamental issue is that people, who do not necessarily trust each other, want to transact in a way that protects both of them. And so rules are established to ensure honesty and, consequently, protection: be who you say you are, send what you say you will, do not send what you are not in possession of. The problem is that humans, in practice, stray from rules. The intermediary steps in to create systems and repercussions that dissuade noncompliance.

But what happens when the intermediary strays? When, in fact, intermediaries are just groups of people who are as likely to veer from the rules as the rest of us? When, in fact, they are an opaque structure that profits off of our dependence on them? Enter Bitcoin. The perfect intermediary. Code that is open source and transparent. Code that is, by nature, disinterested in profiting off of us. Code that acts predictably and cannot stray. Code that makes it so we cannot stray, cannot go back on our word (immutable), and can see what everyone else is doing (transparent).

All of this makes Bitcoin exciting. People see an immutable, secure public ledger that everyone can trust, and the potential for good seems boundless. Just think of all the potential for data integrity and storage! But there is a clumsiness here, a misconception. The Bitcoin blockchain isn’t fundamentally about data storage; it’s about how people transact. The ledger and all of its characteristics create a system whereby people can only transact in ways that are truthful. There is no way for bad data to enter the system and no way to corrupt it.
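The point that validation rules, rather than storage, do the work can be made concrete with a toy sketch. The function and field names below are hypothetical, and real Bitcoin validation relies on digital signatures and unspent transaction outputs rather than a simple balance table, but the principle is the same: a transaction that breaks the rules never enters the record.

```python
def apply_transaction(balances, tx):
    """Apply tx to the balance table only if it obeys the rules;
    reject it otherwise, so invalid state can never enter the ledger."""
    sender, receiver, amount = tx["from"], tx["to"], tx["amount"]
    if amount <= 0:
        return False  # send what you say you will
    if balances.get(sender, 0) < amount:
        return False  # do not send what you are not in possession of
    balances[sender] -= amount
    balances[receiver] = balances.get(receiver, 0) + amount
    return True

balances = {"alice": 10}
print(apply_transaction(balances, {"from": "alice", "to": "bob", "amount": 4}))   # True: accepted
print(apply_transaction(balances, {"from": "bob", "to": "carol", "amount": 50}))  # False: rejected
print(balances)  # {'alice': 6, 'bob': 4}
```

Because every participant runs the same checks before accepting a transaction, the ledger's truthfulness is a property of the rules, not of any one record-keeper.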

And yet, there is something truly compelling about blockchain technology’s potential beyond Bitcoin. An immutable ledger introduces possibilities in the social sector such as secure land records in communities with no formal claims to land, an identity system for those who have been denied service by financial institutions, and many other applications. But unlike Bitcoin, these blockchains are not truth machines. There is no magical mechanism that prevents false property titles, fraudulent identity claims, or incorrect provenance information from being added to the immutable ledger.

Absent a truth mechanism, an intermediary is likely still needed to vet and authenticate information. It is once again the intermediary’s job to ensure that the information can be trusted. Here it might seem that blockchain leaves us off where we began, once again dependent on a potentially corrupt intermediary. However, what blockchain technology can do is decrease the likelihood of an intermediary acting in corrupt ways while also increasing the agency of the end user. This potential comes down to the design of the blockchain.

Immutability can prevent an intermediary from corrupting information on the system, and transparency can be used to trace the passage of information. Though immutability and transparency may seem like design flaws in a system that handles personal information, the system can be designed to protect and empower users: for example, through a redress process that protects users' interests, and through transparency applied to metadata rather than to potentially sensitive information.

What blockchain really introduces is the opportunity for a power shift. It can give users tools to trust the intermediary so they do not have to do so blindly. It can also distribute the power of any one intermediary through a multi-actor system, and give users ultimate control over their own data and how it’s used. The design of the blockchain can open the potential for a transformative paradigm shift.

However, there is also the danger that blockchain technology opens users to new risks and might put them at the hands of a more opaque and corrupt system. Hence, the Beeck Center for Social Impact + Innovation is working to develop an actionable framework for mitigating these privacy and ethical concerns through design choices. We want to ensure that the decision makers in the blockchain space have a toolset for driving social good with this promising technology while reducing harm to the user.