Senior Data Analyst

Join us as a Senior Data Analyst

At Slyce, we are revolutionizing food delivery operations with our globally unique technology. Many of the world’s largest restaurant brands already rely on our SaaS solutions.

We combine cutting-edge tech with deep industry knowledge to help restaurants boost their reach, increase revenues, and grow customer loyalty.

Our founders have a proven track record — building and selling the world's largest virtual restaurant brand, Honest Food, to Delivery Hero. If you want to be part of building the next big thing in food-tech from scratch, this is your chance.

Your Role

We’re looking for a Senior Data Analyst who empowers our C‑level to make data-driven decisions every day. You will report to the Chief Data Officer as part of the data team. Day to day, you will collaborate closely with the CEO and COO and dive deep into how our algorithms perform for our clients.

You will help the business prepare case studies to win trials and help us understand what works and what does not. You will work closely with Data Engineering and Data Science to improve the performance of our decision-making.

This is a great opportunity to work closely with the founding team in a fast-growing startup.

Your Responsibilities

  • Own performance analytics: Analyze how our algorithms impact restaurant performance (orders, revenue, margin, marketing spend) and define the KPIs that matter.
  • Partner with C‑level: Work directly with the CEO, COO, and CDO to answer strategic questions, prioritize opportunities, and support key decisions with data.
  • Build case studies & narratives: Create clear, compelling case studies and ROI analyses to support sales, customer success, and marketing.
  • Design metrics & dashboards: Define core metrics and build dashboards that give the business a real-time view of performance across markets, products, and cohorts.
  • Collaborate with Data Science & Engineering: Work with DS/DE to evaluate model performance, define experiments, and translate business needs into data and modeling requirements.
  • Champion data quality & self‑service: Help improve our data models, documentation, and tooling so that stakeholders can reliably self‑serve for routine questions.

Your Profile

  • Experience: 3–6 years of experience in data analytics, product analytics, or analytics engineering.
  • SQL & warehousing: Strong proficiency in SQL and experience working with large datasets in a cloud data warehouse (BigQuery or similar).
  • Analytics & statistics: Solid understanding of analytic methods and experimentation (A/B testing, cohort analysis, basic statistical inference).
  • Python for analysis: Strong proficiency in Python for data processing and automation (e.g. pandas, Jupyter/Colab, simple scripts).
  • Data modeling & transformations: Experience working with well-structured data models; comfortable reading and contributing to dbt-based transformations.
  • Workflow literacy: Hands-on experience with modern data workflows and orchestration (e.g. Airflow or Cloud Composer), at least as a power user.
  • BI & visualization: Experience building reports and dashboards in a BI tool (e.g. Looker, Tableau, Metabase, Mode, Power BI).
  • Stakeholder communication: Strong communication skills, especially in explaining complex analyses to non-technical stakeholders and C‑level.
  • Solution-oriented mindset: You know how to prioritize and find pragmatic solutions that fit the time available.
  • Collaboration: Ability to work both independently and in cross-functional teams, proactively driving topics to completion.
  • Work authorization: EU work authorization required (we do not sponsor visas at this time).

Nice to Have Skills

  • Domain knowledge: Experience with marketplaces, food delivery, or digital marketing/advertising.
  • ML product analytics: Familiarity with evaluating and monitoring ML-driven products (recommendation, bidding, optimization).
  • GCP experience: Previous experience with GCP (BigQuery, Cloud Composer, Pub/Sub, Vertex AI) or GCP Professional Data Engineer certification.
  • Data quality: Understanding of data quality frameworks (e.g. Great Expectations, Soda) and how to implement basic checks.
  • Streaming & real-time: Experience with streaming or event-based data (Pub/Sub, Kafka, Dataflow, or similar).
  • APIs & integrations: Experience working with RESTful APIs and third-party platform integrations.
  • Infrastructure as Code: Knowledge of tools like Terraform or Pulumi is a plus, especially if you’ve worked closely with data infrastructure.

Our Tech Stack

  • Data & ML: Cloud Composer (Airflow), dbt, Python, BigQuery, Cloud Storage, Pub/Sub, Vertex AI
  • Other: TypeScript, Docker, Kubernetes, Pulumi, GCP

What We Offer

  • Impact: Join early and shape a platform used by thousands of restaurants across Europe.
  • Technology: Build scalable, ML-powered infrastructure using modern data tools on GCP.
  • Hybrid work: Flexible balance between our Berlin office and home office.
  • Leadership access: Work directly with our founders and senior engineering team.
  • Time off: 27 days vacation plus German public holidays.

Are You Ready to Build the Future with Us?

Send your CV and a few lines on why you are the perfect fit for Slyce to alvaro@slyce.io.

Let’s create something extraordinary — together.