Airbnb is a mission-driven company dedicated to helping create a world where anyone can belong anywhere. It takes a unified team committed to our core values to achieve this goal. Airbnb's various functions embody the company's innovative spirit and our fast-moving team is committed to leading as a 21st century company.

About the Team

We are looking for a Senior Analytics Engineer to join the Commercial Products team, a product and platform team. On the product front, we help hosts reach their financial goals by enabling them to set prices, manage their calendars, and act on the insights we provide. On the platform front, we provide Airbnb with actionable insights to improve our marketplace and community. Our team's data and models are used across the company, with dozens of downstream use cases.

About the Position

Analytics Engineers build the data foundation for reporting, analysis, experimentation, and machine learning at Airbnb. We are looking for someone with expertise in metric development, data modeling, SQL, Python, and large-scale distributed data processing frameworks like Presto or Spark. Using these tools, you will transform data from data warehouse tables into valuable data artifacts that power impactful analytic use cases (e.g. metrics, dashboards). You will sit at the intersection of data science and data engineering, and work collaboratively to achieve highly impactful outcomes. Data can transform how a company operates; high data quality and strong tooling are the biggest levers for achieving that transformation. You will make that happen.

You will work closely with data scientists, data engineers, and full stack engineers to architect new data models, design logging schemas, and build metrics, among other things. Specifically, you will be responsible for building data models and metrics that help us understand how hosts engage with pricing and calendar tools, which are key to hosts' success on Airbnb.

Responsibilities:

  • Define requirements and schemas for logging backend and frontend events in our calendar display and pricing tools
  • Define and improve standardized metrics for understanding pricing and availability for listings on our platform
  • Conduct metrics validation and implement monitoring for regressions
  • Build dashboards to communicate metric trends to various stakeholders
  • Conduct product analyses and leverage insights to inform and shape team roadmap

Minimum Qualifications:

  • Passion for high data quality and scaling data science work
  • 5+ years of relevant industry experience
  • Strong communication skills
  • Strong skills in SQL and distributed system optimization (e.g. Spark, Presto, Hadoop, Hive)
  • Expertise in at least one programming language for data analysis (e.g. Python, R)
  • Experience in schema design and dimensional data modeling
  • Ability to perform basic statistical analysis to inform business decisions
  • Proven ability to succeed in both collaborative and independent work environments
  • Detail-oriented and excited to learn new skills and tools

Preferred Qualifications:

  • Experience with an ETL framework like Airflow
  • Python, Scala, Superset, and Tableau skills
  • An eye for design when it comes to dashboards and visualization tools
  • Familiarity with experimentation and machine learning techniques


We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.