Available Offers for ETL

Lead Data Engineer

Remote
Full-time

The project, a platform for creating and publishing content on social media using artificial intelligence tools, is looking for a Lead Data Engineer.


Responsibilities:

- Design, develop, and maintain robust and scalable data pipelines for collecting, processing, and storing data from diverse social media sources and user interactions.

- Design the data warehouse.

- Implement rigorous data quality checks and validation processes to uphold the integrity, accuracy, and reliability of social media data used by our AI models.

- Automate Extract, Transform, Load (ETL) processes to streamline data ingestion and transformation, reducing manual intervention and enhancing efficiency.

- Continuously monitor and optimize data pipelines to improve speed, reliability, and scalability, ensuring seamless operation of our AI Assistant.

- Collaborate closely with Data Scientists, ML Engineers, and cross-functional teams to understand data requirements and provide the necessary data infrastructure for model development and training.

- Enforce data governance practices, guaranteeing data privacy, security, and compliance with relevant regulations, including GDPR, in the context of social media data.

- Establish performance benchmarks and implement monitoring solutions to identify and address bottlenecks or anomalies in the data pipeline.

- Collaborate with data analysts and business teams to design interactive dashboards that enable data-driven decision-making.

- Develop and support data marts and dashboards that provide real-time insights into social media data.

- Stay updated with emerging data technologies, tools, and frameworks, evaluating their potential to improve data engineering processes.


Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

- Proven experience in data engineering, focusing on ETL processes, data pipeline development, and data quality assurance.

- Strong proficiency in programming languages such as Python and SQL, along with knowledge of data engineering libraries and frameworks.

- Experience with cloud-based data storage and processing solutions, such as AWS, Azure, or Google Cloud.

- Familiarity with DataOps principles and Agile methodologies.

- Excellent problem-solving skills and the ability to work collaboratively in a cross-functional team.

- Strong communication skills to convey technical concepts to non-technical stakeholders.

- Knowledge of data governance and data privacy regulations is a plus.

Data Engineer with Informatica

Remote
Full-time

Project: an IT service provider for the life sciences and healthcare industry.


Requirements:

  • Proficiency in administering Informatica PowerCenter or Informatica Cloud, including installation, configuration, and maintenance of the Informatica environment.
  • Knowledge of securing the Informatica environment, including user access management, authentication methods, and data encryption.
  • Proficiency in monitoring Informatica resources, managing server logs, and performing routine maintenance tasks to ensure smooth operations.
  • Experience with scripting languages (e.g. Python or Bash) and automation tools to automate administrative tasks and improve efficiency.
  • Experience in optimizing Informatica workflows, sessions, and mappings for better performance and efficiency.
  • Ability to diagnose and resolve issues related to Informatica workflows, data integration, connectivity, and performance.
  • Ability to handle change requests and coordinate deployments.
  • Knowledge of data quality management, including profiling, cleansing, standardization, and validation techniques.


Nice to have:

  • Familiarity with database concepts and SQL queries.
  • Experience with ETL processes and tools, including the ability to design and implement efficient data workflows.
  • Understanding of the Informatica Cloud (IDMC) integration process with other applications and systems via APIs.
  • Knowledge of data warehousing concepts, such as data modeling, dimensional modeling, and star schemas.
  • Familiarity with big data technologies, such as Hadoop, Spark, and NoSQL databases.
  • Understanding of DevOps methodologies and tools, which help in managing and deploying Informatica Cloud (IDMC) pipelines and processes more efficiently.



Technical Product Manager (DWH/BI)

Office
Remote
Full-time

We are a leading back-office solution provider for fintech companies, specializing in brokerage-related solutions. We have been operational for over ten years and continually innovate on a wide range of products and services for the financial industry.


Currently, we are looking for a Technical Product Manager with a strong IT background. You will be working as part of a highly talented team of IT and business specialists. The team’s top priority is to deliver new features and improve existing services.


You like:

  • A challenge;
  • A stable environment to nurture your passion for technology;
  • A chance to grow as a specialist;
  • To work in an agile, fast-paced environment;
  • To deliver quality work on time and focus on business value;
  • Meaningful work that has a profound impact on the company and the industry.


About you:

  • Excellent communication skills. You communicate effectively across a disparate set of stakeholders with a range of technical skills and data literacy;
  • 2+ years of experience as a product manager in an Agile environment, ideally in the Data Products and Business Intelligence space;
  • Exceptional project management skills. You get things done and, more importantly, you get others to get their things done too;
  • Analytical and decision-making skills. You can diagnose and identify issues as well as formulate effective action plans to resolve them;
  • Systems thinking: the ability to see how small details interrelate and impact the bigger picture;
  • Experience with data visualization software;
  • Strong knowledge of data warehousing and ETL fundamentals;
  • Experience writing SQL queries to explore and QA data.


As an advantage:

  • Hadoop and Spark experience;
  • Vertica experience;
  • Familiarity with Apache Airflow;
  • Basic understanding of statistics and statistical testing;
  • Familiarity with JIRA and Confluence;
  • Familiarity with the finance domain.


Responsibilities:

  • Working directly with internal customers (at all levels) and engineers to identify data platform and visualization requirements and discover new areas of innovation;
  • Translating the needs of non-technical systems users into technical requirements;
  • Understanding and quantifying trade-offs to prioritize data and business intelligence engineering projects;
  • Being an evangelist for data, data visualizations, and self-service analytics;
  • Writing and managing user stories in an agile environment; developing short and long-term strategic roadmaps;
  • Being a champion of data quality.


We offer:

  • Work in an international company;
  • Comprehensive health insurance, social guarantees;
  • Paid sports activities;
  • Fruits, cookies and great coffee;
  • Sponsored educational package;
  • Modern office environment in the center of Riga;
  • Friendly team and career growth opportunities;
  • Fully subsidized parking near the office after the probation period;
  • Remote work during the COVID-19 pandemic.