SUMMARY:
- 7+ years of experience working with distributed systems.
- Expertise in developing ETL pipelines using data parallelism, SQL, and Python.
- 7+ years of ETL development experience with expertise in Ab Initio and data warehousing.
- Strong programming skills in Python and UNIX shell scripting.
- Well-versed in Apache Drill, Amazon Redshift, and Cloudera Impala.
- Expertise in Oracle, Teradata and Confidential DB2.
- Experience handling large datasets of more than 700M records.
- Hands-on experience with the Hadoop ecosystem, Apache Spark, and Apache Drill.
- Product management experience at Confidential Technologies for Apache Drill.
- Experience working in an Agile environment (Scrum methodology).
- Expertise with the Ab Initio suite, including GDE, EME, and BRE.
- Strong development experience with Ab Initio components and graph-building techniques for building ETL pipelines.
- Expertise in Ab Initio concepts such as partitioning and parallelism, vectors, generic graphs, and psets.
- Strong communication and presentation skills.
- Strong analytical and problem-solving skills.
TECHNICAL SKILLS:
Programming Languages: Python, Python libraries (NumPy, Pandas, Matplotlib, scikit-learn), Java
Big Data Tools: Apache Drill, Apache Spark, Amazon Redshift, Cloudera Impala
ETL Tools: Ab Initio Co>Operating System 3.1.5.2, Ab Initio EME, Ab Initio Conduct>It, Confidential DataStage 8.5
RDBMS: Teradata, Teradata Workload Management, Confidential DB2, Oracle
Operating Systems: AIX (UNIX), Mac OS X, Windows
Scheduling Tool: Confidential Tivoli JS Console
Other Skills: UNIX shell scripting, Confidential Watson Analytics, Business Process Modeling, Product Management, Microsoft Office Suite
PROFESSIONAL EXPERIENCE:
Confidential
ETL Developer
Tools: Ab Initio GDE, Ab Initio Conduct>It, UNIX, Microsoft SQL Server
Responsibilities:
- Working as an ETL developer on the team.
- Developing Ab Initio graphs using optimization techniques.
- Handling around 700M records.
- Interacting with business analysts and product owners to design product features.
Confidential, Chicago, IL
ETL Developer
Tools: Ab Initio GDE, Ab Initio EME, Ab Initio BRE, UNIX, Oracle, Apache Spark
Responsibilities:
- Onsite coordinator with two offshore resources.
- Developed graphs using optimization techniques in Ab Initio.
- Working in the Enterprise Rewards domain at Confidential.
- Processing around 200M records.
- Led the setup of the production assurance environment to support the developed application.
- Requirements gathering and analysis, functional and technical design.
- Developed shell scripts to automate various processes and ease support activities.
- Working in an Agile environment with a Kanban approach.
Confidential, San Jose, CA
Product Management Intern
Tools: Apache Drill, Apache Spark, Amazon Redshift, Cloudera Impala, Amazon Kinesis Analytics
Responsibilities:
- Trained on Apache Spark, Apache Drill, and the Confidential Converged Data Platform (Hadoop platform).
- Brainstorming with the engineering team for requirements and impact analysis.
- Technical research on improving the resource management feature in Apache Drill.
- Technical research on Impala Admission Control and Amazon Redshift Workload Management.
- Frequent customer interaction for requirement gathering and product demonstration.
- Research to integrate Apache Drill with Confidential Streams.
- Created the Product Requirements Document.
- Technical and market research on Streaming SQL products, with a focus on Amazon Kinesis Analytics.
Confidential
ETL Applications Developer/Associate
Tools: Ab Initio v3.1.5.2, Teradata, AIX (UNIX)
Responsibilities:
- Responsible for requirement analysis, technical design of the application, development of Ab Initio jobs and unit testing, coordination with the QA team across execution cycles, and fixing production issues.
- Built subscription components to make generic Ab Initio graphs reusable across the project.
- Made extensive use of Ab Initio components such as Reformat, Rollup, Scan, Merge, Lookup, Partition, Gather, Normalize, and Denormalize for transformations.
- Extracted data from sources with millions of records.
- Handled all tasks single-handedly as the sole member of the team.
- Handled data volume of around 200M records.
Confidential
ETL Applications Developer/Senior Systems Engineer
Tools: Ab Initio v3.0.0.4, Confidential DataStage 8.5, Confidential DB2, UNIX
Responsibilities:
- Created a new framework, including building wrapper scripts and designing metadata table structures.
- Developed Ab Initio and Confidential DataStage jobs and performed unit testing.
- Handled data volume of around 300M records.
- Writing UNIX shell scripts for automation.
- Worked with Ab Initio components such as Reformat, Rollup, Scan, Merge, Lookup, Partition, and Gather.
- Responsible for training new team members on ETL tools and UNIX.
- Responsible for converting Ab Initio jobs from version 2.7 to 3.0.
- Processed data ranging from millions to billions of records.
- Made extensive use of Ab Initio components such as Rollup, Scan, Normalize, Denormalize, and Sort for transformations.
- Worked on integration of Ab Initio with C/C++ in a consistent environment.
- Provided support in debugging technical issues.