IBM · posted 3 days ago
Hybrid - Bangalore, IN
Professional, Scientific, and Technical Services

About the position

In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology. A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation of success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Responsibilities

  • Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
  • Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
  • Coordinate data access and security so that data scientists and analysts can easily access data whenever they need to.

Requirements

  • Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python.
  • Experience with HBase and Hive.
  • Good to have: experience with AWS (S3, Athena, DynamoDB, Lambda), Jenkins, and Git.
  • Experience developing Python and PySpark programs for data analysis.
  • Strong working experience using Python to develop custom frameworks for generating rules.
  • Experience writing Python code to gather data from HBase and designing solutions implemented with PySpark.
  • Experience with Apache Spark DataFrames/RDDs for business transformations and with Hive context objects for read/write operations.
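One of the requirements above is a custom Python framework for generating and applying rules. As a minimal sketch of what that can mean in practice (all names here — `Rule`, `RuleEngine`, the sample rules — are illustrative assumptions, not taken from the posting):

```python
# Sketch of a small rule framework in plain Python.
# All class and rule names are illustrative, not from the job posting.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Rule:
    name: str
    predicate: Callable[[dict], bool]  # True when the record matches the rule

class RuleEngine:
    def __init__(self) -> None:
        self.rules: list[Rule] = []

    def register(self, name: str):
        """Decorator that registers a predicate function as a named rule."""
        def wrap(fn: Callable[[dict], bool]) -> Callable[[dict], bool]:
            self.rules.append(Rule(name, fn))
            return fn
        return wrap

    def evaluate(self, record: dict) -> list[str]:
        """Return the names of all rules that match the record."""
        return [r.name for r in self.rules if r.predicate(record)]

engine = RuleEngine()

@engine.register("missing_id")
def missing_id(rec: dict) -> bool:
    return not rec.get("id")

@engine.register("negative_amount")
def negative_amount(rec: dict) -> bool:
    return rec.get("amount", 0) < 0

print(engine.evaluate({"id": "", "amount": -5}))  # both rules match
```

In a Spark job, the registered predicates would typically be applied per record (for example inside a UDF or an RDD `filter`) rather than driven from plain dicts; the decorator-based registry is just one common way to keep rules declarative and discoverable.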

Nice-to-haves

  • Understanding of DevOps.
  • Experience in building scalable end-to-end data ingestion and processing solutions.
  • Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
Hard Skills
  • Python
  • Git
  • IBM i
  • IBM WAS
  • Java
© 2024 Teal Labs, Inc