Data and the Backstage of Politics — Interview with Paul Wescott

In contemporary politics, data sits at the center of how decisions are made, messages are crafted, and voters are reached. From campaign targeting and voter segmentation to policy research and civic engagement, the intelligent use of data is transforming not only how politics is practiced, but also how it is understood. At the same time, as technological sophistication grows, so does public skepticism, particularly toward traditional tools such as opinion polls. This evolving landscape raises important questions about how data can be used responsibly, effectively, and for the public good.

To discuss these themes, I spoke with Paul Wescott, Executive Vice President at L2 Data, one of the leading providers of voter, consumer, and constituent information in the United States. Over the past decade, Wescott has helped drive L2’s growth across politics, academia, corporate affairs, and digital partnerships, positioning the company at the intersection of data, strategy, and civic innovation.

Before joining L2, Paul built a career in journalism and media, working for outlets such as NBC News, Fox News Channel, and iHeartMedia. Drawing from that experience, our conversation also touches on a critical aspect of modern politics: how to communicate insights derived from data to broader audiences, bridging the gap between analytical depth and public understanding.

Data building: the underworking of politics.

At the heart of L2 Data’s work lies an ambitious and highly technical mission: to maintain and continuously refine two of the most comprehensive datasets in the United States—one for registered voters and another for consumers. Together, these databases represent nearly the entire adult population, with around 218 million registered voters and 262 million consumers. As Paul Wescott explains, these datasets form the foundation for a wide range of clients, from political campaigns and advocacy organizations to academic researchers and government officials. Every record is standardized, cleaned, and enriched with hundreds of demographic, behavioral, and attitudinal variables—creating what Wescott calls “the underworking of politics,” the invisible infrastructure that allows modern campaigns to target, mobilize, and communicate with precision.

This process, however, is far from simple. Data from voter registries across all fifty states, and, in some cases, individual counties, must be consolidated, verified, and updated. Duplicates are removed, deceased individuals are filtered out, and changes in residence are tracked to maintain accuracy. But what distinguishes L2’s approach is the integration of voter data with consumer information, obtained from licensed commercial providers such as credit bureaus and marketing firms. Through advanced data linking, each record can be matched to a real individual’s demographic profile, digital identifiers like IP or mobile advertising IDs, and behavioral predictors. This allows clients, for example, to locate “every Democrat who voted in three of the last four primaries in New York City” and reach them via phone, email, or digital advertising. It’s a powerful tool that underpins much of the contemporary machinery of political communication.
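The consolidation steps Wescott describes can be pictured with a minimal sketch. This is purely illustrative: the field names, dedupe key, and records below are invented, and a real pipeline would use far more sophisticated record linkage.

```python
# Illustrative sketch of the consolidation described above: standardize
# records, suppress deceased individuals, deduplicate, and let later
# files update earlier ones (tracking changes of residence).
# All field names and example records are hypothetical.

def normalize(record):
    """Standardize casing and whitespace so records compare cleanly."""
    return {
        "name": record["name"].strip().upper(),
        "dob": record["dob"],
        "address": record["address"].strip().upper(),
        "deceased": record.get("deceased", False),
    }

def consolidate(state_files):
    """Merge per-state voter files into one deduplicated list."""
    seen = {}
    for records in state_files:
        for raw in records:
            rec = normalize(raw)
            if rec["deceased"]:                # filter out deceased records
                continue
            key = (rec["name"], rec["dob"])    # naive dedupe key
            seen[key] = rec                    # later files win: tracks moves
    return list(seen.values())

voters = consolidate([
    [{"name": "Jane Doe ", "dob": "1980-01-02", "address": "1 Main St"}],
    [{"name": "jane doe", "dob": "1980-01-02", "address": "9 Oak Ave"},   # moved
     {"name": "John Roe", "dob": "1975-05-06", "address": "2 Elm St",
      "deceased": True}],
])
```

Here the two "Jane Doe" entries collapse to one record carrying her newest address, while the deceased record is dropped entirely.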

Working at this scale inevitably raises ethical and legal questions about privacy and consent. Wescott is candid about this challenge: “We consider ourselves stewards of the data,” he notes, emphasizing that while much of the voter file is public information, the company applies strict safeguards to prevent misuse. Personally identifiable information (such as names, addresses, or birth dates) is handled carefully, and sensitive data like Social Security numbers or credit scores are explicitly excluded. Moreover, in response to growing public concern and state-level privacy legislation like the California Consumer Privacy Act (CCPA), L2 implemented a nationwide “universal opt-out” policy. Any individual can request removal from the database, and once excluded, they are treated “as if they had died,” permanently erased from all future updates. It’s an unusual step in the data industry, one that reflects a deliberate balance between commercial utility and respect for individual privacy.
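The mechanics of a "universal opt-out" can be sketched as a persistent suppression list: once an identifier is opted out, every future data refresh silently skips it. The class and identifiers below are hypothetical, not L2's actual implementation.

```python
# Minimal sketch of a universal opt-out policy: a persistent suppression
# set that blocks opted-out individuals from all future updates.
# IDs and records are invented for illustration.

class VoterDatabase:
    def __init__(self):
        self.records = {}
        self.opted_out = set()   # persistent suppression list

    def opt_out(self, person_id):
        """Honor a removal request: drop the record and block re-entry."""
        self.opted_out.add(person_id)
        self.records.pop(person_id, None)

    def ingest(self, updates):
        """Apply a data refresh, skipping suppressed individuals."""
        for person_id, data in updates.items():
            if person_id in self.opted_out:
                continue   # treated as if permanently removed
            self.records[person_id] = data

db = VoterDatabase()
db.ingest({"v1": {"name": "A"}, "v2": {"name": "B"}})
db.opt_out("v1")
db.ingest({"v1": {"name": "A"}, "v3": {"name": "C"}})  # v1 never returns
```

The key design point matches Wescott's description: the suppression list outlives any single dataset refresh, so removal is permanent rather than per-update.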

This awareness of ethical responsibility connects to a broader theme in Wescott’s thinking: the intersection between data, theory, and innovation. He rejects the notion that data work must be purely operational or detached from the intellectual frameworks of political science. In fact, L2 actively collaborates with research institutions such as NYU and other elite universities, where their datasets serve as a “source of truth” for empirical inquiry. These partnerships not only validate L2’s data integrity but also inspire new ways of conceptualizing political behavior. As Wescott describes, even internal innovation can emerge from theoretical curiosity. One recent project, initiated by one of L2’s engineers, seeks to map family relationships within the voter file, linking siblings, parents, and extended relatives across states. The idea, born from a mix of technical experimentation and sociological insight, could open new avenues for understanding political influence and social networks within families.
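The family-mapping project described above can be illustrated with a toy heuristic: grouping voters who share a surname and an address as candidate household relatives. This is only a sketch of the idea, not the engineer's actual method, and real relationship inference would draw on far richer signals across states.

```python
# Toy illustration of mapping candidate family relationships in a voter
# file by grouping on shared surname and address. Names are invented;
# real linkage would be much more sophisticated.

from collections import defaultdict

def candidate_families(voters):
    """Group voters on (surname, address); keep multi-member groups."""
    groups = defaultdict(list)
    for v in voters:
        surname = v["name"].split()[-1].upper()
        key = (surname, v["address"].upper())
        groups[key].append(v["name"])
    return {k: names for k, names in groups.items() if len(names) > 1}

fams = candidate_families([
    {"name": "Ana Silva", "address": "3 Pine Rd"},
    {"name": "Luis Silva", "address": "3 Pine Rd"},
    {"name": "Mia Chen", "address": "7 Lake Dr"},
])
```

Even this crude pass surfaces the kind of within-household structure that could feed studies of political influence inside families.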

What stands out from Wescott’s account is a vision of data work that goes beyond selling lists or optimizing campaign outreach. L2’s mission embodies the tension and potential at the core of modern political data: using technology to illuminate patterns of civic participation while maintaining ethical vigilance and methodological rigor. In doing so, Wescott and his team contribute not only to the mechanics of electoral politics but also to the broader conversation about how data can serve democratic understanding. Their work underscores an essential truth: behind every sophisticated model or campaign strategy, there is a human decision about how information should be gathered, interpreted, and used responsibly.

Communicating with data and the role of AI.

One of the most striking aspects of Paul Wescott’s experience is how it bridges two traditionally separate worlds: the precision of data analytics and the art of communication. Having worked in major news organizations like NBC and Fox before joining L2, Wescott brings an acute awareness of how easily data can be misunderstood by the general public. In his view, numbers alone rarely speak for themselves. When L2 publishes early voting data or turnout models on platforms such as X (formerly Twitter) or LinkedIn, the challenge is not just technical accuracy; it’s clarity. “We have to make sure people don’t misinterpret what we’re showing,” he explains. In one example, Wescott described how early ballot data in Virginia showed Democrats “outpacing” Republicans, even though the state does not register voters by party. L2 had modeled the results to estimate partisan lean, but without careful explanation, audiences could easily mistake the data for official statistics. This, he says, is the fine line between information and misinformation, a line that every data communicator must learn to navigate.

The difficulty of conveying nuance in an attention-driven media environment means that every dataset becomes, in Wescott’s words, “a piece of content that needs extra explaining.” He emphasizes the need to balance brevity and depth: graphics and summaries must be instantly understandable but also linked to the methodological transparency that prevents distortion. In this sense, communicating data is both a technical and ethical exercise. The goal is to empower interpretation, not to manipulate it. This concern has grown more acute in an era when political data circulates rapidly through social media, where context is easily lost and partisanship colors perception. For Wescott, maintaining openness about how L2’s data are modeled (what is observed, what is inferred, and what remains uncertain) is essential for sustaining public trust in an age of polarization and information overload.

Artificial intelligence, meanwhile, is reshaping how these data are processed, verified, and interpreted. L2 has relied on machine learning techniques for years to clean and integrate massive datasets, but recent advances have expanded the role of AI from a background process to a strategic tool. “AI allows us to review and quality-check data at a scale that would be impossible for human operators,” Wescott notes. For instance, when updating precinct-level information across millions of records, AI can instantly detect discrepancies between L2’s data and official state sources, flagging inconsistencies that might take analysts hours, or days, to find manually. The same applies to client applications: campaign teams and researchers use AI-driven tools to code open-ended survey responses, turning unstructured opinions (“I don’t like this candidate” or “I might donate”) into quantifiable categories that can be mapped, analyzed, and even linked back to voter profiles. In short, AI has become a catalyst for transforming raw expression into structured political insight.
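The task Wescott describes, turning open-ended survey responses into quantifiable categories, can be sketched with a trivial keyword-based coder. This is a deliberate stand-in: a production system would use an LLM or a trained classifier, and the categories and keywords below are invented for illustration.

```python
# Toy stand-in for AI-assisted coding of open-ended survey responses:
# map free text onto quantifiable categories that can then be analyzed
# or linked back to voter profiles. Rules here are purely illustrative.

CATEGORIES = {
    "negative_candidate": ["don't like", "dislike", "oppose"],
    "donation_intent": ["donate", "contribute", "give money"],
}

def code_response(text):
    """Return the categories whose keywords appear in the response."""
    lowered = text.lower()
    return [cat for cat, keywords in CATEGORIES.items()
            if any(kw in lowered for kw in keywords)]

codes = [code_response(r) for r in [
    "I don't like this candidate",
    "I might donate",
    "No opinion",
]]
```

Once responses carry structured codes, they can be aggregated, mapped, and joined against other records, which is exactly the transformation from raw expression to structured insight described above.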

Yet, Wescott’s perspective is far from techno-utopian. He recognizes that AI introduces new responsibilities alongside new efficiencies. Algorithms can accelerate understanding, but they can also amplify bias or error if used carelessly. For him, the challenge is to integrate AI in ways that enhance, rather than obscure, the human judgment at the core of political analysis. “We’re just at the beginning,” he admits, suggesting that the future of political data will depend not only on computational power but also on communicative ethics, the ability to explain, contextualize, and humanize the outputs of increasingly complex systems. In that balance between automation and accountability, between speed and meaning, lies the next frontier of data-driven politics.

The crisis of trust in public opinion polls.

Public confidence in opinion polls has eroded over the past decade, and Wescott attributes this decline not to a loss of methodological rigor but to a profound transformation in how people communicate. “In the 1980s and 1990s, you could call landline numbers and actually reach respondents,” he notes. “People picked up the phone.” Today, the environment is radically different. The rise of cell phones, call blockers, and spam filters has made random sampling increasingly difficult. Text messaging briefly offered a solution, but even that channel has tightened as carriers restrict automated outreach. The result is a structural challenge: researchers must make more contacts than ever to reach fewer people, all while ensuring demographic balance and representativeness.

To adapt, the industry is becoming more inventive, sometimes even desperate. Wescott describes how polling firms now rely on hybrid strategies that combine traditional surveys with online panels, mail questionnaires, and direct incentives like digital gift cards. L2’s voter file data, often used as a sampling backbone for these panels, helps researchers cross-check and validate responses, integrating quantitative and qualitative inputs to improve reliability. Yet, the fundamental issue remains: the social fabric that once made polling viable, a public willing to answer questions, has frayed. As Wescott puts it, “Researchers are in a pickle.” Restoring trust, he suggests, will depend not only on better technology but on rebuilding the human connection between citizens and the institutions that seek to understand them.

The social impact of working with data.

When asked about the broader social implications of his work, Paul Wescott acknowledges that the influence of data in society is complex, a “mixed bag,” as he calls it. The United States, he explains, has developed a unique data ecosystem built around multiple layers of information. At L2, the focus is on structured, verifiable datasets that capture snapshots in time: voter files, consumer records, and demographic information that provide a stable foundation for research and civic engagement. This type of data allows scholars, journalists, and policymakers to better understand social dynamics and electoral behavior, offering a more complete picture of who Americans are and how they participate in democracy.

Wescott contrasts this with what he calls “the other kind of data”: the continuous streams collected by digital platforms through AI, apps, and online tracking. These systems, from Instagram listening for keywords to Amazon following users across the web, belong to a different universe, one defined by immediacy and behavioral targeting. “It’s almost like surveillance to a degree,” he notes. While those practices may shape consumer choices and political communication in real time, they also raise pressing questions about privacy, autonomy, and consent. For Wescott, the distinction matters because conflating all forms of data collection risks obscuring the more legitimate and transparent work done by research-driven organizations.

Ultimately, Wescott argues that the real measure of social impact lies in how data is used. L2’s datasets support academic and journalistic projects, from sociological studies of the electorate to policy evaluations, that contribute to a better-informed public sphere. Yet he also recognizes that the same infrastructure fuels political campaigning, which can overwhelm citizens with texts, emails, and mailers. “That’s not really a data issue,” he says. “That might just be a politics issue.” In his view, data itself is neutral, its social consequences depend on intent, transparency, and the capacity to channel information toward understanding rather than manipulation.

When data contradict beliefs.

One of the most delicate aspects of working with political data, Wescott admits, comes when evidence contradicts a client’s expectations or worldview. In today’s polarized environment, numbers are rarely received as neutral. “It’s particularly hard in this day and age because you have very divisive politics right now,” he says. People often read data through the lens of identity, not information. When a model categorizes someone as a “likely supporter” of a candidate they dislike, the reaction can be visceral: disbelief, denial, even anger. For Wescott, this tension illustrates a broader truth: data analysis doesn’t exist in a vacuum; it interacts with human emotion, identity, and bias.

Wescott recounts cases where predictive models misclassify individuals, for example, labeling a progressive living in a conservative suburb as a Trump supporter. “If you’re looking for a perfect dataset, good luck,” he jokes. “If you find it, let me know, I’d be happy to buy it.” Such outliers, however, don’t invalidate the broader reliability of the system. What they reveal is how easily people conflate individual accuracy with aggregate validity. For most users familiar with big data, small errors are understood as part of the probabilistic nature of modeling. But for others, especially older clients less accustomed to algorithmic reasoning, even minor discrepancies can erode trust.

This dynamic exposes a subtle but crucial skill in data work: the ability to communicate uncertainty without undermining credibility. As Wescott sees it, transparency is key. A model is not a judgment, but a probabilistic reflection of patterns in the data. Bridging that understanding requires more than technical precision; it requires empathy and clear communication. “That’s how big data works,” he says. “There are going to be inaccuracies, but as a whole, the data is solid and represents the electorate.” In an age where facts themselves can be contested, this may be the most essential role of data professionals: not just producing truth, but helping others learn how to recognize it.

The increasing role of consultants in government.

In closing, Wescott reflects on the evolving relationship between campaigning and governing, two spheres that were once distinct but are now increasingly intertwined. “The margin between them has shrunk over the years,” he observes. The same tools used to persuade, mobilize, and engage voters during a campaign can be repurposed to inform, educate, and involve citizens once a candidate takes office. For Wescott, this is not about “electioneering” but about building continuity between political communication and public service. Data-driven engagement, he argues, can make governance more transparent, responsive, and participatory.

Consultants, in this view, have a crucial role to play in that process. By helping policymakers use the same analytics, segmentation, and outreach strategies that drive successful campaigns, they can enhance the connection between representatives and constituents. “Come to a town hall, sign up to get more information, be part of the process,” Wescott says. The challenge, and the opportunity, lie in turning the mechanics of persuasion into mechanisms of inclusion, transforming data not just into votes, but into dialogue.

Francisco M. Olivero
