J.P. Morgan Off Campus Drive 2020 | Hiring Experienced Professionals as Data Engineer I

Published by hitman on


J.P. Morgan is hiring for the position of Software Engineering - Data Engineer I. For insights, practice materials and more updates on off-campus and internship drives, stay active on our website.

J.P. Morgan Wealth Management is an asset manager of choice for institutions, financial intermediaries and individual investors worldwide. J.P. Morgan Wealth Management is undergoing a digital transformation aimed at bringing in best-of-breed digital technologies. This will enable regions and markets to meet their digital ambitions, enhance communication and information dissemination to clients, engage in thought leadership and client education, and, most importantly, play a pivotal role in influencing the sales funnel as the business becomes a data-driven organization.

The J.P. Morgan Wealth Management Digital Technology Team is looking for a highly motivated, hands-on technology engineer to build, lead and deliver data platform capabilities using some of the latest technologies available. As a data platform engineer, your mission is to work closely with our team of innovators, marketers, data scientists and technologists to create next-level solutions that improve the way our business is run, making it a data-driven organization. Your deep knowledge of design, analytics, development, coding, testing and application programming will help the team raise its game, meeting your standards as well as satisfying both business and functional requirements. Your expertise in various technology domains will be counted on to set strategic direction and solve complex, mission-critical problems, internally and externally. Your commitment to embracing leading-edge technologies and methodologies will inspire your team to follow suit. And best of all, you'll be able to harness massive amounts of brainpower through our global network of technologists from around the world.

Location: Bengaluru, Karnataka, India

The ideal candidate will undertake the following responsibilities:

· Design and implement efficient data pipelines to integrate data from a variety of sources into the WM Data Warehouse.

· Map clear relationships and flows between different marketing technologies from a customer-lifecycle standpoint; deliver analysis, reporting and data visualization.

· Implement data delivery solutions – data design, ETL procedures, end-to-end data process management.

· Align deliverables with enterprise data solutions and infrastructure controls.

· Communicate data delivery solutions to functional partners across multiple levels of the organization.

· Develop data ingestion scripts and data joins to feed data visualization dashboards.

· Automate data pipelines using Airflow.

· Provide support for investigating and troubleshooting production issues.

· Be an expert-level pipeline developer using PySpark.

· Develop dashboard websites using Flask or other frameworks.

· Be accountable for integrity in data reporting and interpretation.
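The core of the responsibilities above is the extract-transform-load pattern: pull data from a source system, clean it, and load it into a warehouse table. The posting names PySpark and Airflow for this; as a minimal, self-contained sketch of the same pattern, here is a plain-Python version using `sqlite3` as a stand-in warehouse. All names (the CSV layout, the `balances` table) are illustrative, not part of the actual WM Data Warehouse schema.

```python
import csv
import io
import sqlite3

# Hypothetical source: a CSV export from one of several upstream systems.
RAW_CSV = """account_id,balance,as_of
A001,1500.50,2020-06-01
A002,,2020-06-01
A003,320.00,2020-06-01
"""

def extract(raw):
    """Read raw CSV rows into dictionaries."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Drop rows with missing balances and cast types."""
    cleaned = []
    for r in rows:
        if r["balance"]:  # simple quality filter: balance must be present
            cleaned.append((r["account_id"], float(r["balance"]), r["as_of"]))
    return cleaned

def load(rows, conn):
    """Insert cleaned rows into a warehouse-style table; return its row count."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS balances (account_id TEXT, balance REAL, as_of TEXT)"
    )
    conn.executemany("INSERT INTO balances VALUES (?, ?, ?)", rows)
    return conn.execute("SELECT COUNT(*) FROM balances").fetchone()[0]

conn = sqlite3.connect(":memory:")
loaded = load(transform(extract(RAW_CSV)), conn)
print(loaded)  # 2 — the row with a missing balance is filtered out
```

In a production pipeline each of these three steps would typically become a task in an Airflow DAG, with the transform implemented as a PySpark job rather than a Python loop.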

This role requires a wide variety of strengths and capabilities, including:

· BS/BA degree or equivalent experience

· Hands-on experience in engineering data solutions.

· Proficient with data management, reporting and analytical methods.

· Thorough understanding of relational database and data management solutions.

· Technical understanding of data extraction, transformation and load processing (ETL)

· Strong Python, UNIX scripting and SQL experience is a must.

· Experience building Data Lake/DM using Cloudera or Hortonworks distributions.

· Strong experience in HDFS, MapReduce, YARN, Hive, Impala, Airflow and Sqoop.

· Extensive experience in Spark leveraging Python.

· Able to tune big data solutions to improve performance.

· Expertise in Data governance and Data Quality.

· Experience with DevOps (CI/CD) pipelines and release management for data engineering.

· Extensive knowledge of data engineering, relational database implementation, Enterprise Data Warehouses (e.g. Google BigQuery), distributed filesystems (Hadoop), ETL/ELT procedures (Extract, Transform, Load) and/or technical sales experience.

· Proven understanding of analytics platforms (ideally GAIQ-certified), Google Analytics 360 and Firebase.

· Advanced use of data visualization, reporting and dashboard tools (e.g. Google Data Studio, Looker, Google Sheets, Excel, Power BI, Tableau)
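Several of the requirements above (data governance, data quality, integrity in reporting) come down to validating a batch of data before it is published. A minimal sketch of such a quality gate is below; the rules, thresholds and column names are illustrative assumptions, not J.P. Morgan's actual checks.

```python
def quality_report(rows, required, min_rows=1):
    """Return a list of human-readable failures; an empty list means the batch passes."""
    failures = []
    # Volume check: a suspiciously small batch often signals an upstream outage.
    if len(rows) < min_rows:
        failures.append(f"row count {len(rows)} below minimum {min_rows}")
    # Completeness check: required columns must be populated in every row.
    for i, row in enumerate(rows):
        for col in required:
            if row.get(col) in (None, ""):
                failures.append(f"row {i}: missing required column '{col}'")
    return failures

batch = [
    {"account_id": "A001", "balance": 1500.5},
    {"account_id": "", "balance": 320.0},  # fails the not-null rule
]
issues = quality_report(batch, required=("account_id", "balance"))
print(issues)  # ["row 1: missing required column 'account_id'"]
```

A pipeline would typically run such a gate as its final task and fail the run (or quarantine the batch) when the report is non-empty, rather than silently loading bad data.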

Preferred Knowledge and Experience:

§ Knowledge of Digital Marketing, online acquisition channels (PPC, SEO, email / CRM) and the importance of attribution

§ Executive speaking and presentation skills: formal presentations, whiteboarding, and large- and small-group presentations.

§ Experience with technical web services development/deployment, IT systems and network engineering, security and compliance, etc.

§ Understanding of Agile methodologies, and the ability to apply these practices to analytics projects.

§ Expert understanding of the SDLC (Software Development Life Cycle) process including Agile and Waterfall

§ Hands-on knowledge of Java 8 is preferred.

§ Knowledge of NoSQL databases such as Cassandra, HBase, MongoDB, DynamoDB and Elasticsearch is a plus.

§ Exposure to big data projects using Kafka, Cassandra and the Apache NiFi stack, on premises or in the cloud.