Senior Analytics Engineer
Airbnb is a mission-driven company dedicated to helping create a world where anyone can belong anywhere. It takes a unified team committed to our core values to achieve this goal. Airbnb's various functions embody the company's innovative spirit and our fast-moving team is committed to leading as a 21st century company.
Founded in August of 2008 and based in San Francisco, California, Airbnb is a trusted community marketplace for people to list, discover, and book unique travel experiences around the world. Whether an apartment for a night, a castle for a week, or a villa for a month, Airbnb allows people to Belong Anywhere through unique travel experiences at any price point, in more than 85,000 cities and over 191 countries. We promote a culture of curiosity, humanity, and creativity through our product, brand, and, most importantly, our people.
About the Team
The Airbnb Payments Engineering team's mission is to create a world-class payments platform and economically empower the global Airbnb community. We are looking for Analytics Engineers to join our team. We own and curate the payments data models, data resources, and metrics that are critical to our business. In this role you will own some of Airbnb's most critical data assets.
About the Position
Analytics Engineers build the data foundation for reporting, analysis, experimentation, and machine learning. We are looking for someone with expertise in metric development, data modeling, SQL, Python, and large-scale distributed data processing frameworks such as Presto or Spark. Using these tools, along with first-class internal data tooling, you will transform data warehouse tables into critical data artifacts that power impactful analytic use cases (e.g. metrics, dashboards) and empower downstream data consumers. As an Analytics Engineer, you will sit at the intersection of data science, product analytics, and data engineering, and work collaboratively to achieve highly impactful outcomes.
Data can transform how a company operates; high data quality and strong tooling are the biggest levers for achieving that transformation. You will make that happen.
What You'll Do
- Understand data needs by interfacing with fellow Analytics Engineers, Data Scientists, Data Engineers, and business partners
- Architect, build, and launch efficient & reliable data models and pipelines in partnership with Data Engineering
- Design and implement metrics and dimensions to enable analysis and predictive modeling
- Design and develop dashboards or other data resources to enable self-serve data consumption
- Build tools for auditing, error logging, and validating data tables
- Define logging needs in partnership with Data Engineering
- Define and share best practices on metric, dimension, and data model development for analytics use
- Build and improve data tooling in partnership with Data Platform teams
- Be a technical expert on data model usage
- Own and review code changes to certified metric and dimension definitions
- Manage communication of data model updates and changes across the organization
- Ensure data models are fully documented, and metrics and dimensions have clear descriptions and metadata
What We're Looking For
- Passion for high data quality and scaling data science work
- 6+ years of relevant industry experience
- Strong skills in SQL and distributed system optimization (e.g. Spark, Presto, Hive)
- Experience in schema design and dimensional data modeling
- Experience in at least one programming language for data analysis (e.g. Python, R)
- Proven ability to succeed in both collaborative and independent work environments
- Detail-oriented and excited to learn new skills and tools
- Effective storytelling and articulation skills: the ability to convert analytical output into clear, concise, and persuasive insights and recommendations for technical and non-technical audiences
- Strong influence and relationship management skills
Nice to Have
- Experience with an ETL framework like Airflow
- Python, Scala, Superset, and Tableau skills preferred
- An eye for design when it comes to dashboards and visualization tools
- Familiarity with experimentation and machine learning techniques