Integration Product Architect

Details of the offer

Required Core Skills:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• Strong proficiency in Java and Spring Boot.
• Experience with Apache Kafka and stream processing.
• Familiarity with Big Data technologies (Hadoop, Spark, etc.).
• Knowledge of NoSQL databases (e.g., Druid, Cassandra, MongoDB).
• Understanding of distributed systems and scalability.

Responsibilities:
• Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
• Design, develop, and implement Kafka-based microservices using Spring Boot (see the illustrative sketch below).
• Build data pipelines for ingesting, processing, and analysing large-scale data sets.
• Optimize Kafka configurations for performance and reliability (see the example settings below).
• Work with Big Data technologies such as Hadoop, Spark, and NoSQL databases.
• Ensure data security, integrity, and compliance with industry standards.
• Troubleshoot and resolve issues related to Kafka topics, consumers, and producers.
• Monitor system performance and proactively address bottlenecks.
• Participate in code reviews and mentor junior developers.
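
As a rough illustration of the Kafka-based microservice work described above, the sketch below shows a minimal Spring Boot service that consumes from one topic and forwards a transformed record to another. This is not code from the posting: the package, class, topic names, and group id (com.example.orders, OrderEventsService, orders.raw, orders.enriched, order-events-service) are hypothetical, and the broker address is assumed to come from standard Spring Kafka configuration (spring.kafka.bootstrap-servers).

package com.example.orders;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical sketch of a minimal Kafka-based Spring Boot microservice.
@SpringBootApplication
public class OrderEventsApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderEventsApplication.class, args);
    }
}

@Service
class OrderEventsService {

    private final KafkaTemplate<String, String> kafkaTemplate;

    OrderEventsService(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // Consume raw events, apply a trivial transformation, and publish the
    // result to a downstream topic. Topic names and group id are invented.
    @KafkaListener(topics = "orders.raw", groupId = "order-events-service")
    public void onOrderEvent(String payload) {
        String enriched = payload.trim().toUpperCase();
        kafkaTemplate.send("orders.enriched", enriched);
    }
}

In a real service the listener would typically deserialize a typed payload, add error handling (for example a retry or dead-letter topic), and be exercised with spring-kafka-test's embedded broker.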
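The "optimize Kafka configurations" responsibility usually comes down to tuning producer, consumer, and topic settings. The snippet below is only an example of producer properties biased toward reliable delivery; the class name ReliableProducerProps and the concrete values are assumptions, not settings taken from this posting.

import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

// Hypothetical example: producer properties biased toward reliable delivery.
public final class ReliableProducerProps {

    private ReliableProducerProps() {
    }

    public static Map<String, Object> build(String bootstrapServers) {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Require acknowledgement from all in-sync replicas before a send succeeds.
        props.put(ProducerConfig.ACKS_CONFIG, "all");
        // An idempotent producer avoids duplicates when retries kick in.
        props.put(ProducerConfig.ENABLE_IDEMPOTENCE_CONFIG, true);
        // Retry transient broker errors within an overall delivery timeout.
        props.put(ProducerConfig.RETRIES_CONFIG, Integer.MAX_VALUE);
        props.put(ProducerConfig.DELIVERY_TIMEOUT_MS_CONFIG, 120_000);
        // Modest linger/batch values trade a little latency for throughput.
        props.put(ProducerConfig.LINGER_MS_CONFIG, 20);
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, 64 * 1024);
        return props;
    }
}

Such a map could be passed to a KafkaProducer<String, String> or used to back a Spring ProducerFactory; the right batching and timeout values depend on the workload being tuned.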
Nice to have skills:
• Certification in Kafka or related technologies.
• Experience with cloud platforms (AWS, Azure, GCP).
• Knowledge of containerization (Docker, Kubernetes).

Detailed Job Description:
• Minimum years of experience: 12 years.
• Excellent problem-solving skills and attention to detail.
• Effective communication and teamwork abilities.


Nominal Salary: To be agreed

