ETL Production Support Resume
SUMMARY
- Data Engineer with hands-on experience in designing, developing, testing, and maintaining enterprise-wide data warehouse, data integration, and data migration projects.
- 4+ years of experience implementing efficient ETL pipelines that integrate a variety of applications.
- Good understanding of SDLC, STLC, and Agile methodologies.
- Able to review user stories and understand the technical aspects of ETL/data processes.
- Ability to write and understand complex SQL queries.
- Good understanding of the IBM TWS scheduler along with UNIX shell scripting.
- Proficient in analyzing and translating business requirements to technical requirements and architecture.
- Strong database skills and ETL solution design and implementation knowledge using IBM DataStage.
- Efficient in all phases of the development lifecycle, including data cleansing, data conversion, performance tuning, and system testing.
- Expertise in creating reusable components such as generic extract, load, and validation jobs.
- Expertise in incorporating various data sources such as Oracle, DB2, Teradata, and flat files into the staging area (illustrated in the sketch following this summary).
- Experience in UNIX Shell Scripting.
- Experience using and writing SQL against Oracle, DB2, Teradata, and SQL Server databases.
- Proven track record in troubleshooting DataStage jobs and addressing production issues such as performance tuning, bug fixes, and enhancements.
- Hands-on exposure to Hadoop, Hive, and Spark using Scala.
- Good communication and interpersonal skills; self-motivated, quick learner, and team player.
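Illustration only: a minimal Spark Scala sketch of the kind of staging-area ingestion described above. The connection URL, credentials, table names, and file path are hypothetical placeholders, not details from any actual engagement.

```scala
import org.apache.spark.sql.SparkSession

object StageLoader {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("StageLoader")
      .enableHiveSupport()
      .getOrCreate()

    // Hypothetical Oracle source pulled over JDBC into the staging schema.
    val oracleDf = spark.read
      .format("jdbc")
      .option("url", "jdbc:oracle:thin:@//dbhost:1521/ORCL")   // placeholder connection
      .option("dbtable", "CLAIMS.MEMBER")                      // placeholder table
      .option("user", sys.env.getOrElse("DB_USER", ""))
      .option("password", sys.env.getOrElse("DB_PASS", ""))
      .load()

    // Delimited flat file landed by an upstream feed.
    val fileDf = spark.read
      .option("header", "true")
      .option("delimiter", "|")
      .csv("/landing/member_extract.dat")                      // placeholder path

    // Persist both to Hive staging tables for downstream ETL jobs.
    oracleDf.write.mode("overwrite").saveAsTable("stg.member_oracle")
    fileDf.write.mode("overwrite").saveAsTable("stg.member_file")

    spark.stop()
  }
}
```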
TECHNICAL SKILLS
Languages: SQL, UNIX Shell Scripting, Scala (Spark)
Databases: Oracle, SQL Server
Tools: IBM DataStage, TWS, Autosys, Jenkins, CA Agile Central, FACETS
Concepts: ETL, Hadoop, CI/CD, Agile
Operating System: Windows, Linux
PROFESSIONAL EXPERIENCE
Confidential
ETL Production Support
Responsibilities:
- Work as technical lead implementing ETL capabilities that complement the healthcare administration application Facets, ensuring all business logic and adjudication rules are applied and federal and state compliance requirements are met.
- Serve as ETL SME for the client management team, providing direction on the current and future state of applications and supplying analysis for informed decision making.
- Develop, test, and deploy jobs in DataStage, Talend, and Spark Scala.
- Monitor and evaluate applications comprising 3000+ DataStage, Talend, and Spark Scala jobs to identify performance issues and fine-tune them.
- Analyze, identify, and modify the current system to reduce chargebacks wherever possible.
- Migrated existing mainframe interfaces by designing and developing new extracts in Hadoop (MapR) using Spark Scala, Hive, and Kafka (see the sketch following this list).
- Performed initial server and project setup for Spark and prepared the coding standards document for the team.
- Develop server/application monitoring scripts that notify support teams of over-utilization of server resources (memory/space), upcoming Kafka certificate renewals, and bad jobs.
- Review and enforce industry standards and framework design for new technology adoptions such as Talend and Kafka.
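Illustration only: a minimal Spark Scala sketch of the kind of Hive-to-Kafka extract that replaced a mainframe file interface, as referenced above. It assumes a Hive-enabled Spark environment with the spark-sql-kafka connector on the classpath; the table, topic, and broker names are hypothetical placeholders.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, struct, to_json}

object ClaimExtractToKafka {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ClaimExtractToKafka")
      .enableHiveSupport()
      .getOrCreate()

    // Pull the current day's records from a Hive table (names are placeholders).
    val claims = spark.sql(
      "SELECT claim_id, member_id, status, paid_amount FROM edw.claims WHERE load_dt = current_date()")

    // Serialize each row to JSON and publish to a Kafka topic,
    // replacing the flat file the mainframe interface used to produce.
    claims
      .select(col("claim_id").cast("string").as("key"),
              to_json(struct(claims.columns.map(col): _*)).as("value"))
      .write
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")   // placeholder brokers
      .option("topic", "claims.extract")                   // placeholder topic
      .save()

    spark.stop()
  }
}
```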
Confidential
DataStage Developer
Responsibilities:
- Effort estimations.
- Created HLD & LLD documents.
- Created generic DataStage jobs and sequences; reviewed job designs created by the team.
- Enforced standards among all users of the application to ensure uniformity.
- Developed generic jobs that handle Type 2 changes in dimensions from source to stage 1 (staging area) and stage 2 (warehouse); see the Type 2 sketch following this list.
- Performance-tuned the jobs to significantly cut extract, transform, and load times.
- Worked with DataStage admins to tune load times during peak hours.
- Conducted design review, code review & test result review with client/Business team.
- Worked with program team to get the signoffs for the project deployment.
- Created the deployment, rollback, and post-deployment support plans for the project.
- Resolved production issues.
- Created the QA handover document template and the unit test result template.
- Created the technical checkout template used for post-deployment validations.
- Conducted knowledge transfer (KT) to the production support team.
- Performed effort estimation of the project and change requests.
- Ensured zero defect code delivery.
- Acted as gatekeeper for new changes to the ESP application.
- Anticipated, highlighted, and discussed the impact of new changes on the existing application.
- Worked with several offshore team members to complete the implementation.
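The generic Type 2 jobs referenced above were built in DataStage; purely to illustrate the pattern they implement, here is a minimal Spark Scala sketch of Type 2 change handling. The table and column names (stg.customer, dw.customer_dim, customer_id, address, status) are hypothetical placeholders.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object ScdType2Sketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ScdType2Sketch")
      .enableHiveSupport()
      .getOrCreate()

    val incoming = spark.table("stg.customer")                  // hypothetical staging table
    val current  = spark.table("dw.customer_dim")               // hypothetical dimension
      .filter(col("current_flag") === "Y")

    // Compare incoming rows against the current dimension image on the business key.
    val joined = incoming.alias("s")
      .join(current.alias("d"), col("s.customer_id") === col("d.customer_id"), "left")

    // New keys and rows whose tracked attributes changed open a new version.
    val opened = joined
      .filter(col("d.customer_id").isNull ||
              col("s.address") =!= col("d.address") || col("s.status") =!= col("d.status"))
      .select(incoming.columns.map(c => col("s." + c)): _*)
      .withColumn("current_flag", lit("Y"))
      .withColumn("effective_start_date", current_date())
      .withColumn("effective_end_date", lit(null).cast("date"))

    // The matching current versions of changed keys are closed out (flag off, end-dated today).
    val expired = joined
      .filter(col("d.customer_id").isNotNull &&
              (col("s.address") =!= col("d.address") || col("s.status") =!= col("d.status")))
      .select(current.columns.map(c => col("d." + c)): _*)
      .withColumn("current_flag", lit("N"))
      .withColumn("effective_end_date", current_date())

    // A downstream load step applies the opened/expired sets to the warehouse dimension.
    opened.write.mode("overwrite").saveAsTable("wrk.customer_dim_opened")
    expired.write.mode("overwrite").saveAsTable("wrk.customer_dim_expired")

    spark.stop()
  }
}
```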
Confidential
Responsibilities:
- Responsible for key modules and for delivering solutions to a customer in the insurance sector.
- Deliver new and complex high-quality solutions to clients in response to varying business requirements.
- Responsible for managing scope, analysis, and solution design for various aspects of the project.
- Responsible for translating customer requirements into technical solutions and implementing them.
- Translate customer requirements into formal requirement and design documents, establish specific solutions, and lead the efforts, including programming and testing, that culminate in client acceptance of the results.
- Utilize in-depth functional and technical experience in ETL design and database migration systems, along with other leading-edge products and technologies, in conjunction with industry and business skills to deliver solutions to the customer.
- Establish quality procedures for the team and continuously monitor and audit to ensure the team meets its quality goals.