Data Scientist

Obol Technologies

Data Science
Lisbon, Portugal
Posted on Friday, August 25, 2023
Who Are We?
Obol Labs is a remote-first research and software development team focused on Proof of Stake infrastructure for public blockchain networks. Specific topics of focus are Internet Bonds, Distributed Validator Technology, and Multi-Operator Validation. The core team includes 28 members spread across more than 12 countries.
The core team is building the Obol Network, a protocol to foster trust-minimized staking through multi-operator validation. This will enable low-trust access to Ethereum staking yield, which can be used as a core building block in various Web3 products.
The Network
The network can best be visualized as a work layer that sits directly on top of the base layer consensus. This work layer is designed to provide the base layer with more resiliency and decentralization as it scales. In this chapter of Ethereum, the next great scaling challenge is stake centralization. Layers like Obol are critical to the long-term viability and resiliency of public networks, especially networks like Ethereum.
Obol as a layer is focused on scaling main chain staking by providing permissionless access to Distributed Validators. The network utilizes a middleware implementation of Distributed Validator Technology (DVT) to enable the operation of distributed validator clusters that can preserve validators' current client and remote signing configurations.
Similar to how roll-up technology laid the foundation for L2 scaling implementations, we believe DVT will do the same for scaling the consensus layer while preserving decentralization. Staking infrastructure is entering its protocol phase of evolution, which must include trust-minimized staking networks that can be plugged into at scale. We believe DVT will evolve into a widely used primitive and will ensure the security, resiliency, and decentralization of public networks.
The Obol Network develops and maintains four core public goods that will eventually work together through circular economics:
The DV Launchpad, a User Interface for bootstrapping and managing Distributed Validators
Charon, a middleware Golang client that enables validators to run in a fault-tolerant, distributed manner
Obol Managers, a set of Solidity libraries for the formation of Distributed Validators tailored to different use cases such as DeFi, Liquid Staking, and Fractionalized Deposits
Obol Testnets, a set of ongoing public incentivized testnets that enable operators of any size to test their deployments before serving on Ethereum mainnet
Sustainable Public Goods
Obol is inspired by previous work on Ethereum public goods and experimenting with circular economics. We believe that to unlock innovation in staking use cases, a credibly neutral layer must exist for innovation to flow and evolve vertically. Without this layer, highly available uptime will continue to be a moat.
The Obol Network will become an open, community-governed, self-sustaining project over the coming months and years. Together we will incentivize, build, and maintain distributed validator technology that makes public networks a more secure and resilient foundation to build on top of.
The Obol team is growing and is looking to make its first data-focused hire to support the team. This role covers data-related challenges across the whole Obol team, spanning gathering and serving the data needs of our customers, data-focused product development, and software performance and benchmarking frameworks for the engineering team.
The main goal of this role is to improve both the collection and interpretation of data about the Obol Collective within the wider Ethereum Proof of Stake space.
This role is expected to work closely with Product, a Data Ops engineer, and the wider engineering team to define potential research questions, run experiments, and generate reports that answer these questions.

Responsibilities

  • Maintain reusable data analysis pipelines for analyzing distributed validator clusters under different testing environments, enabling the engineering team to fine-tune Charon’s performance.
  • Work with customers to figure out how they can ingest on-chain and off-chain data about the performance of their DVs in a manner that suits their needs.
  • Work with product and engineering teams to incorporate real-time and historical aggregate data into Obol’s customer-facing products.
  • Improve our production data collection and analysis to identify unknown issues with Distributed Validators occurring on the network.
  • Develop guidelines for customers around the acceptable bounds of latency, bandwidth, and hardware requirements for DVs to perform optimally.
  • Work with SRE teams on their alerting and data analysis to improve their ability to notice issues in a DV before it goes offline.
  • Improve the software development life cycle through the use of A/B testing or other data-driven methods of quantifying the effectiveness of a given change to a codebase.
  • Work on data cleaning, ensuring that the data we collect is accurate and analyzed correctly, potentially by cross-referencing different data sets against one another.
  • Be able to support ad-hoc data-related requests from across the org, while also making forward progress on ongoing data projects.
  • Train the team to be able to leverage and interpret the data sources they have access to, reducing their dependence on the data team for producing analysis on demand.
  • Develop reports and dashboards outlining DV adoption and performance for internal and external consumption.

Requirements

  • Be excited about data!
  • Working knowledge of one or more data tech stacks, for instance Python, PostgreSQL for raw data, Spark jobs for processing data, and Jupyter Notebooks for analysis.
  • Demonstrable understanding of relational databases.
  • Working knowledge of the different approaches to ETL/ELT processes for data analysis.
  • Professional-level understanding of statistics and probability.
  • Good communication skills to support cross-functional discussions and explanations.
  • Willingness to work in a distributed remote team.

Nice to have

  • Previous experience in Web3
  • Knowledge of proof of stake Ethereum
  • Familiarity with the issues encountered when running servers

Benefits

  • Location: Fully Remote Working
  • Offsites: Annual global offsite
  • Generous paid time off
  • Personal equipment & training budget
Thank you for your interest. Looking forward to building amazing stuff together!