Available Offers: Cloud Computing

Senior Machine Learning and Image Processing Specialist

Remote

Project description

We are looking for a highly skilled Senior Machine Learning and Image Processing Specialist to join our innovative team. This role requires extensive experience in graphics, including 3D graphics, and proficiency in neural networks and machine learning approaches for graphics processing. The successful candidate will lead a team of developers working on cutting-edge projects.


Responsibilities:

- Lead the development team in designing and implementing ML algorithms for image and 3D graphics processing

- Develop and optimize image processing pipelines and 3D graphics algorithms

- Utilize neural networks and advanced ML techniques to enhance graphics processing capabilities

- Collaborate with cross-functional teams to implement ML solutions

- Conduct code reviews, provide technical mentorship, and ensure best practices in software development

- Stay updated with the latest advancements in machine learning, neural networks, and graphics processing


Must-have skills:

- Solid programming skills in C/C++ and Python

- Proven expertise in ML and image processing techniques

- Hands-on experience with deep learning frameworks (e.g., TensorFlow, PyTorch)

- Extensive experience with 3D graphics and related technologies

- Solid knowledge of mathematics, particularly linear algebra

- Strong knowledge of neural networks and their application in graphics processing

- Excellent leadership skills with experience in leading development teams

- Strong problem-solving abilities and the capability to work collaboratively in a team environment

- Proficiency in writing GPU kernels for algorithm optimization


Nice to have:

- Knowledge of computer vision techniques

- Familiarity with cloud platforms and their services related to ML and graphics processing

- Experience with software development best practices, including version control and CI/CD pipelines


Other:

- English: B2 (Upper Intermediate)

Lead Data Engineer

Remote
Full-time

The project, a platform for creating and publishing content on social media using artificial intelligence tools, is looking for a Lead Data Engineer.


Responsibilities:

- Design, develop, and maintain robust and scalable data pipelines for collecting, processing, and storing data from diverse social media sources and user interactions.

- Design the data warehouse architecture.

- Implement rigorous data quality checks and validation processes to uphold the integrity, accuracy, and reliability of social media data used by our AI models.

- Automate Extract, Transform, Load (ETL) processes to streamline data ingestion and transformation, reducing manual intervention and enhancing efficiency.

- Continuously monitor and optimize data pipelines to improve speed, reliability, and scalability, ensuring seamless operation of our AI Assistant.

- Collaborate closely with Data Scientists, ML Engineers, and cross-functional teams to understand data requirements and provide the necessary data infrastructure for model development and training.

- Enforce data governance practices, guaranteeing data privacy, security, and compliance with relevant regulations, including GDPR, in the context of social media data.

- Establish performance benchmarks and implement monitoring solutions to identify and address bottlenecks or anomalies in the data pipeline.

- Collaborate with data analysts and business teams to design interactive dashboards that enable data-driven decision-making.

- Develop and support data marts and dashboards that provide real-time insights into social media data.

- Stay updated with emerging data technologies, tools, and frameworks, evaluating their potential to improve data engineering processes.


Qualifications:

- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.

- Proven experience in data engineering, focusing on ETL processes, data pipeline development, and data quality assurance.

- Strong proficiency in programming languages such as Python and SQL, and knowledge of data engineering libraries and frameworks.

- Experience with cloud-based data storage and processing solutions, such as AWS, Azure, or Google Cloud.

- Familiarity with DataOps principles and Agile methodologies.

- Excellent problem-solving skills and the ability to work collaboratively in a cross-functional team.

- Strong communication skills to convey technical concepts to non-technical stakeholders.

- Knowledge of data governance and data privacy regulations is a plus.

Chief Backend Developer

Remote
Full-time
Project-based engagement
Tasks:

- Refinement and modification of existing system modules
- Development of new business and integration modules
- Development of new system modules for the transition to a microservice architecture, with subsequent migration to the cloud (Spring Boot, Docker, OpenShift, Istio)
- Upgrading the existing technology stack (Apache Kafka, Apache Ignite, ClickHouse, gRPC + Protobuf)
- Optimization and refactoring of the current solution
- Participation in the elaboration and adoption of architectural decisions
- Participation in discussions on the implementation of system enhancements
- Code review and development of unit tests
- Implementation of integrations with AI models developed in the Bank

Expectations:

- Higher technical education; 5+ years of experience
- Excellent knowledge of Java 8, design patterns, and multithreading
- Understanding of current practices and approaches to implementing highly loaded systems and parallel computing
- Knowledge of J2EE and Spring
- Proficiency in JAX-RS, JPA, and EJB
- Experience with Maven and Git
- Experience with ORMs (Hibernate/OpenJPA/Spring Data), XPath, JAXB, and MQ
- Knowledge of SQL at the level of writing queries of medium complexity

Would be a plus:

- Experience with SOAP, JAX-WS, JTA, and JMS
- Experience with Apache Kafka, Apache Ignite, ClickHouse, and gRPC + Protobuf
- Experience with the WildFly application server, Spring Boot, Docker, and OpenShift/Kubernetes
- Experience in projects run with agile methodologies (Scrum, Kanban)
- Experience writing unit tests (JUnit/TestNG)
- Experience with Oracle 12 and PostgreSQL databases; understanding of SQL query optimization principles
- Ability to work with the Linux command line and write simple Bash scripts
- Experience with Jira, Confluence, and Jenkins
- Knowledge of DevOps practices

Java Developer

Remote
Full-time
Project-based engagement
Tasks:

- Refinement and modification of existing system modules
- Development of new business and integration modules
- Development of new system modules for the transition to a microservice architecture, with subsequent migration to the cloud (Spring Boot, Docker, OpenShift, Istio)
- Upgrading the existing technology stack (Apache Kafka, Apache Ignite, ClickHouse, gRPC + Protobuf)
- Optimization and refactoring of the current solution
- Participation in the elaboration and adoption of architectural decisions
- Participation in discussions on the implementation of system enhancements
- Code review and development of unit tests
- Implementation of integrations with AI models developed in the Bank

What we expect from you:

- Higher technical education
- 5 years of work experience
- Excellent knowledge of Java 8, design patterns, and multithreading
- Understanding of modern practices and approaches to implementing highly loaded systems and parallel computing
- Knowledge of J2EE and Spring
- Proficiency in JAX-RS, JPA, and EJB
- Experience with Maven and Git
- Experience with ORMs (Hibernate/OpenJPA/Spring Data), XPath, JAXB, and MQ
- Knowledge of SQL at the level of writing queries of medium complexity

Will be a plus:

- Experience with SOAP, JAX-WS, JTA, and JMS
- Experience with Apache Kafka, Apache Ignite, ClickHouse, and gRPC + Protobuf
- Experience with the WildFly application server, Spring Boot, Docker, and OpenShift/Kubernetes
- Experience working in agile projects (Scrum, Kanban)
- Experience writing unit tests (JUnit/TestNG)
- Experience with Oracle 12 and PostgreSQL databases; understanding of SQL query optimization principles
- Ability to work with the Linux command line and write simple Bash scripts
- Experience with Jira, Confluence, and Jenkins
- Knowledge of DevOps practices