Data Engineer

Air-tek
Full-time
On-site
Toronto, Canada
Air-tek is a Canadian software company with a powerful suite of unique products that has already captured a significant share of a huge global opportunity. The product-market fit is excellent, and customers are lining up to buy. Although our global customers know us, we intentionally operate in stealth mode during this growth phase.

Our diverse team shares a collective passion for solving complex problems, a drive to innovate, and a desire to create a passenger-centric travel industry. Based in Toronto, our inclusive culture is built on trust, collaboration, delivering a great product, and continuous personal development. We love what we do, and we support the team around us.

In this role, you will have the opportunity to help build our Data Engineering organization from scratch. As a Data Engineer, you'll design and build our data platform, pipelines, and integrations, and support data products and machine learning initiatives. You will collaborate with leaders across the company to create unique data infrastructure, run tests on designs to isolate errors, and update systems to accommodate changes in company needs.

Key Responsibilities:

    • Design and develop scalable data pipelines and systems to support data extraction, transformation, and loading (ETL).
    • Collaborate with stakeholders to understand data requirements and deliver solutions.
    • Implement methods to improve data reliability and quality.
    • Combine raw information from different sources into consistent, machine-readable formats.
    • Develop and test architectures for data extraction and transformation.
    • Build and maintain optimal data pipeline architectures.
    • Assemble large, complex data sets.

Basic Qualifications:

    • 2–3 years of work experience in a similar role.
    • Bachelor's degree in computer science, engineering, or a related field.
    • Experience with data warehousing, ETL processes, and data modeling.
    • Proficiency in SQL and programming languages such as Python or Java.
    • Familiarity with big data technologies (e.g., Hadoop, Spark) and cloud platforms (e.g., AWS, Azure, GCP).
    • Strong problem-solving and analytical skills.