Principal BI Engineer - Data Modeling & Analytics Infrastructure

Remote
Full-time

Key Responsibilities

- Design, develop, and maintain scalable data pipelines and ETL processes to ensure efficient data flow across systems.

- Architect and implement robust data warehouse solutions that support business intelligence and analytics initiatives.

- Create and optimize complex SQL queries to extract, transform, and analyze large datasets with high performance.

- Collaborate with cross-functional teams to understand data requirements and deliver appropriate technical solutions.

- Develop and maintain data models that accurately represent business processes and enable effective analytics.

- Implement data quality measures and monitoring systems to ensure data accuracy and reliability.

- Create compelling data visualizations and dashboards that effectively communicate insights to stakeholders.

- Apply statistical methodologies to identify trends, patterns, and anomalies in business data.

- Document data architecture, pipelines, and processes for knowledge sharing and continuity.

- Stay current with emerging technologies and best practices in data engineering and business intelligence.


Required Skills & Experience

- Proven experience (3+ years) in a Data Engineer, BI Developer, or similar role, with demonstrated project success.

- Advanced SQL skills across various database systems, particularly MySQL and SQL Server.

- Strong understanding of data warehouse design concepts, including dimensional modeling and star schemas.

- Proficiency in designing and implementing ETL/ELT workflows for large-scale data processing.

- Strong programming skills in Python (preferred), or equivalent experience in Java or C#, for data manipulation and analysis.

- Hands-on experience with data visualization tools and techniques to effectively present complex information.

- Solid foundation in statistics with the ability to apply statistical methods for data analysis.

- Knowledge of data quality management and governance principles.

- Experience working with cloud-based data platforms (AWS, Azure, or GCP).

- Excellent problem-solving abilities and attention to detail.


Nice to Have

- Experience with cloud data warehouses such as Snowflake, BigQuery, or Redshift.

- Knowledge of streaming data processing using Kafka, Spark Streaming, or similar technologies.

- Familiarity with data orchestration tools such as Apache Airflow or Luigi.

- Experience with NoSQL databases like MongoDB, Cassandra, or DynamoDB.

- Understanding of data science workflows and machine learning pipelines.

- Experience with version control systems (Git) and CI/CD pipelines for data projects.

- Certifications related to data engineering, cloud platforms, or database management.

- Background in implementing data governance frameworks and data quality monitoring.

- Experience working in agile development environments.