
Senior Data Engineer


Workato

On-site

Regular employment

5 - 15 years of experience

Full Time

Barcelona, Spain

About Workato

Workato transforms technology complexity into business opportunity. As the leader in enterprise orchestration, Workato helps businesses globally streamline operations by connecting data, processes, applications, and experiences. Its AI-powered platform enables teams to navigate complex workflows in real time, driving efficiency and agility.

Trusted by a community of 400,000 global customers, Workato empowers organizations of every size to unlock new value and lead in today’s fast-changing world. Learn how Workato helps businesses of all sizes achieve more at workato.com.

Why join us?

Ultimately, Workato believes in fostering a flexible, trust-oriented culture that empowers everyone to take full ownership of their roles. We are driven by innovation and looking for team players who want to actively build our company. 

But, we also believe in balancing productivity with self-care. That’s why we offer all of our employees a vibrant and dynamic work environment along with a multitude of benefits they can enjoy inside and outside of their work lives. 

If this sounds right up your alley, please submit an application. We look forward to getting to know you!

Also, feel free to check out why:

  • Business Insider named us an “enterprise startup to bet your career on”

  • Forbes’ Cloud 100 recognized us as one of the top 100 private cloud companies in the world

  • Deloitte Tech Fast 500 ranked us as the 17th fastest growing tech company in the Bay Area, and 96th in North America

  • Quartz ranked us the #1 best company for remote workers

Responsibilities

At Workato, we’re redefining business automation by integrating innovative technologies that drive digital transformation. We’re seeking a highly skilled Senior Data Engineer to lead the design, development, and optimization of our modern data infrastructure. In this role, you will work extensively with advanced tools such as dbt, Automate DV, Trino, Snowflake, Apache Iceberg, and Apache Airflow to build robust, scalable, and efficient data pipelines that empower our decision-making and analytics capabilities.

You will work closely with data scientists: building and maintaining the data vault they rely on, integrating their models into it, and consolidating different sources of data into a single data warehouse (a minimal query sketch follows this list):

  • Product usage data

  • ETL data from AI services

  • Business data

  • External data
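To make the consolidation concrete, here is a minimal sketch of querying one such source through Trino over an Apache Iceberg table before it lands in the warehouse. It is illustrative only: the host, catalog, schema, and table names are hypothetical assumptions, not details from this posting.

```python
# Illustrative sketch only: host, catalog, schema, and table names are
# hypothetical, not details from this posting.
import trino

# Connect to a Trino coordinator fronting an Apache Iceberg catalog.
conn = trino.dbapi.connect(
    host="trino.example.internal",  # hypothetical coordinator host
    port=8080,
    user="data_engineer",
    catalog="iceberg",
    schema="raw",
)

cur = conn.cursor()

# Aggregate product usage events into a daily rollup that dbt /
# Automate DV staging models could load into the data vault.
cur.execute(
    """
    SELECT account_id,
           date_trunc('day', event_ts) AS event_day,
           count(*)                    AS event_count
    FROM product_usage_events          -- hypothetical Iceberg table
    WHERE event_ts >= current_date - INTERVAL '7' DAY
    GROUP BY 1, 2
    """
)
for account_id, event_day, event_count in cur.fetchall():
    print(account_id, event_day, event_count)
```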

In this role, you will also be responsible for the following:

  • Data Pipeline Development:
    Design, develop, and maintain data pipelines and ETL processes using dbt and Apache Airflow to ensure seamless data integration, transformation, and validation across diverse data sources (a minimal orchestration sketch follows this list).

  • Data Infrastructure Management:
    Architect and implement scalable data solutions utilizing Snowflake as a data warehouse and leverage Trino for efficient query execution across distributed data sets.

  • Modern Data Technologies:
    Integrate and optimize data workflows using Automate DV and Apache Iceberg to manage data versioning, quality, and lifecycle, ensuring reliability and compliance.

  • Collaboration & Leadership:
    Work closely with data scientists, analysts, and business stakeholders to translate requirements into technical solutions. Mentor junior engineers and lead code reviews to promote best practices in data engineering.

  • Performance & Optimization:
    Continuously monitor, troubleshoot, and optimize data processes to ensure high performance, minimal downtime, and optimal resource utilization.

  • Innovation & Best Practices:
    Stay abreast of emerging trends in data engineering and automation, driving innovation and adopting new tools and techniques that enhance data processing and integration capabilities.

Requirements

Qualifications / Experience / Technical Skills

  • Education & Experience:

    • Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field.

    • 5+ years of experience in data engineering, with a proven track record of designing and managing large-scale data infrastructures.

  • Technical Expertise:

    • Proficiency in dbt for data transformation and modeling.

    • Experience with Automate DV for building Data Vault 2.0 models on top of dbt.

    • Hands-on expertise with distributed SQL query engines such as Trino.

    • Deep understanding of Snowflake architecture and its ecosystem.

    • Knowledge of Apache Iceberg for managing large analytic datasets.

    • Strong background in orchestrating workflows using Apache Airflow.

    • Proficiency in SQL and at least one programming language (Python preferred).

  • Analytical & Problem-Solving Skills:

    • Ability to analyze complex data challenges and design innovative, data-driven solutions.

    • Strong debugging skills and attention to detail.

  • Soft Skills:

    • Excellent communication and collaboration skills.

    • Demonstrated leadership and mentoring capabilities.

    • Ability to thrive in a fast-paced, dynamic environment.

Preferred Qualifications

  • Familiarity with cloud data platforms (AWS, GCP, or Azure) and containerization technologies.

  • Experience in agile development methodologies.

  • Proven track record of working in automation-centric environments.

Required skills

Python
SQL
Database development
Snowflake
Apache Airflow
English