DBOI Global Services Pvt. Ltd.

Deutsche Bank Operations International (DBOI) Global Services, a subsidiary of Deutsche Bank, is the bank's global processing arm. Established with the purpose of building a world-class operating infrastructure for Deutsche Bank's global businesses, DBOI Global Services is an integrated network of six processing centers of excellence in the UK, USA, India and the Philippines, delivering 24/7 support for the bank and its clients.

Big Data and Cloud Technologies - AVP


This role is for a developer with strong core application or system programming skills in Scala and Java, and good exposure to concepts and technologies across the broader spectrum. The XCM domain covers a variety of existing systems and green-field projects. The role requires full-stack Hadoop development experience with Scala, as well as full-stack Java development experience covering Core Java (including JDK 1.8) and a good understanding of design patterns.

What we'll offer you

A healthy, engaged and well-supported workforce is better equipped to do its best work and, more importantly, enjoy life inside and outside the workplace. That's why we are committed to providing an environment with your development and wellbeing at its centre.

Your key responsibilities

Should be willing to work with any technology
7-12 years of combined experience as a software developer
Degree holder in a numerate subject
Hands-on Big Data experience in developing enterprise-level applications using Hadoop, MapReduce and NoSQL solutions
Experience in Apache Spark with in-depth knowledge of the streaming and integration APIs
Able to develop solutions using Java and Scala with the Spark framework
Cloud background preferable
Experience in HBase / HDFS / Hive / Impala / Kafka
Good understanding of the Avro and Parquet data formats
Proficiency in Java-related frameworks such as Spring, Hibernate and JPA
Strong hands-on development track record with end-to-end development cycle involvement
Good exposure to computational concepts
Good communication and interpersonal skills
Proficiency in data modelling

Your skills and experience

Strong hands-on development in Hadoop technologies such as Spark and Scala, and experience with Avro
Participation in product feature design and documentation
Requirement break-up, ownership and implementation
Product BAU deliveries and Level 3 production defect fixes
Understanding of middleware such as MQ / TIBCO is an advantage
Understanding of NoSQL is an added advantage
Experience of Data Analytics platforms is advantageous.
Banking experience is advantageous

Qualification: Degree holder in numerate subject

Industry: Information Technology and Services

Functional Area: IT - Software

Experience: 7 - 12 years

Location: Pune (Maharashtra, India)


Job Ad publication date: 7 Sep 2020