Data Analyst & Senior ETL Developer Resume
SUMMARY
- Over 9 years of experience in data warehousing applications using DataStage (ETL) and the UNIX operating system.
- Strong programming experience in UNIX shell scripting to support and automate ETL processes.
- Experience in developing wrapper scripts for DataStage jobs.
- Automated processes and created new shell scripts to reduce redundant work for the team, making the work more efficient and productive.
- Good knowledge of the Health Care, Banking and Insurance domains.
- Good knowledge of Hadoop basics (HDFS, MapReduce, Hive, Sqoop, HBase).
- Good knowledge of Python scripting.
- Good knowledge of and working experience with AWS (S3, Athena and AWS Glue).
- Extensively performed data modeling, testing, logical and physical data warehouse schema design, debugging, troubleshooting, monitoring and performance tuning.
- Good exposure to multiple databases including Oracle, DB2, Teradata, MS SQL Server and MongoDB.
- IBM InfoSphere DataStage 8.5 certified.
- IBM Certified Database Associate - DB2 9.
- Experience working with data modelers to translate business rules/requirements into conceptual/logical dimensional models; worked with complex data models.
- Analyzed business and system requirements, including impact analysis of existing systems, and created detailed requirements in consultation with business users and technical architects.
- Effective working relationships with client teams to understand support requirements and manage client expectations.
- Good experience with Agile methodology and the Scrum process.
- Excellent analytical and problem-solving skills; good communication and interpersonal skills, with the ability to interact with individuals at all levels.
- Capable of working independently as well as being a team player.
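The DataStage wrapper scripts mentioned above might be sketched roughly as follows. This is an illustrative example only: the project and job names are placeholders, and the interpretation of `dsjob` exit codes (commonly 1 = finished OK, 2 = finished with warnings when run with `-jobstatus`) is an assumption to verify against the installed DataStage version.

```shell
#!/bin/sh
# Hypothetical wrapper for a DataStage job; names and paths are placeholders.
PROJECT="${1:-DW_PROJECT}"      # assumed project name
JOB="${2:-LoadCustomerDim}"     # assumed job name
LOG="/tmp/${JOB}.log"

# With -jobstatus, dsjob waits for the job and returns an exit code
# reflecting the job's finishing status; treat 1 (finished OK) and
# 2 (finished with warnings) as success, anything else as failure.
job_ok() {
    [ "$1" -eq 1 ] || [ "$1" -eq 2 ]
}

# Only attempt the run where the DataStage CLI is actually installed.
if command -v dsjob >/dev/null 2>&1; then
    dsjob -run -jobstatus "$PROJECT" "$JOB" > "$LOG" 2>&1
    rc=$?
    if job_ok "$rc"; then
        echo "$JOB completed (status $rc)"
    else
        echo "$JOB failed (status $rc), see $LOG" >&2
        exit 1
    fi
fi
```

A wrapper like this lets a scheduler treat a DataStage job like any other UNIX command, with a single success/failure exit code and a captured log.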
TECHNICAL SKILLS
ETL Tools: DataStage 8.5/8.7/9.1/11.3/11.5/11.7, Informatica
Operating systems: UNIX, Windows
Concepts: Data warehousing and Business Intelligence, Hadoop Basics HDFS/Map Reduce/Hive/Sqoop/HBase
Scripting: Shell, Python, Perl
Databases: Oracle, DB2, SQL (2005, 2007), Teradata, MongoDB & MS SQL Server
Languages: Core Java, C
Scheduler Tools: CA-7, Tivoli Workload Scheduler, Autosys, Control-M, crontab
Methodologies: Agile, Waterfall
PROFESSIONAL EXPERIENCE
Confidential
Data Analyst & Senior ETL developer
Responsibilities:
- Perform data modeling and design the logical and physical data warehouse schema.
- Develop UNIX scripts to validate incoming files before running DataStage jobs.
- Develop DataStage jobs to load the data into application tables, perform query optimization and tuning.
- Create and schedule jobs using crontab to load the data into tables.
- Test and validate the data in each layer of Application.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Prepare the Deployment document and deploy the code from Dev to higher environments.
- Identify problems/issues in production and resolve them within the Application Service Level Agreement (ASLA).
- Educate the line of business on changes moved to production.
- Perform root cause analysis and provide stability to the application.
- Provide support to application users as necessary.
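A minimal sketch of the kind of pre-run file validation described above, checking that an incoming feed file exists, is non-empty, and has a consistent field count before the DataStage job is launched. The pipe delimiter and column count are illustrative assumptions, not details from the resume.

```shell
#!/bin/sh
# Validate an incoming delimited file before handing it to DataStage.
# Usage: validate_file <path> <expected-column-count>
validate_file() {
    file="$1"
    expected_cols="$2"

    # The file must exist and be non-empty before the load runs.
    [ -s "$file" ] || { echo "missing or empty: $file" >&2; return 1; }

    # Every record must have the expected number of pipe-delimited fields.
    awk -F'|' -v n="$expected_cols" 'NF != n { bad++ }
        END { exit (bad > 0) }' "$file"
}
```

Run from a wrapper, a non-zero return from `validate_file` would abort the load and flag the file for the source system, rather than letting a malformed feed reach the warehouse tables.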
Confidential
Senior ETL Developer
Responsibilities:
- Develop DataStage jobs to standardize, validate and generate xml files to load the data into MDM tables.
- Gather requirements from the SA team and discuss them with SMEs.
- Create and schedule jobs using crontab to load the data into tables.
- Created new automation test scripts for testing the functionality of the application.
- Developed new shell scripts to create the baseline file structure required for DataStage jobs and to validate files before running the jobs.
- Test and validate the data in each layer.
- Prepare the Deployment document and deploy the code from Dev to higher environments.
- Identify problems/issues in production and resolve them within the Application Service Level Agreement (ASLA).
- Educate the line of business on changes moved to production.
- Perform root cause analysis and provide stability to the application.
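Scheduling a load through crontab, as described above, might look like the entries below; the times, paths, and script names are placeholders for illustration.

```crontab
# min hour dom mon dow   command
30   2    *   *   *      /app/scripts/run_load_job.sh >> /app/logs/load_job.log 2>&1
0    6    *   *   1-5    /app/scripts/validate_files.sh >> /app/logs/validate.log 2>&1
```

Redirecting both stdout and stderr into a dated or appended log file keeps a history of each unattended run for support and root cause analysis.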
Confidential
Production Support & ETL Developer
Responsibilities:
- Monitoring the Batch cycle for the various applications.
- Working on service requests/incidents raised by clients, which may involve process improvements, enhancements or maintenance of the existing system.
- Resolving ABEND incidents to meet SLAs and ensure the batch cycle runs unperturbed.
- Discretionary projects that include product, strategic and CTO initiatives.
- Tackled known-error problems and reduced incidents.
- Areas of Improvement/Value Addition.
- Cross Training.
- Documentation - Induction Manual, System Manual, Design Documents, Unit Test Plan, Unit Test Results.
- Known Error Database.
- Continuous improvements - Proactive monitoring and controls, Incident trend analytics, Weekly outage lessons learned, RCA and Fixes.
- Metrics Management - SLA, Service Requests capacity management, Problem ticket management, Defect and priority management
- Month-end and Quarter closure support.
- Batch support in handling any server outage (e.g., an unplanned outage due to a faulty CPU).
- Access Requests.
- User enquiries.
- Extended coverage in support of weekend activities & during team member absence.
- Handled multiple performance issues for OSDW users with long-running queries.
- Handled multiple scheduling and look-back issues.
- Data replication from production to integration/development tables (refresh).
- Understanding the systems and processes that are involved.
- Created new shell scripts to validate files and perform modifications on them before sending the files to DataStage.
- Created many shell scripts that stabilized the account and reduced unwanted tickets in the bin.
- Capable of handling teams successfully; coordinated and guided individuals into higher roles.
- Conducted training sessions for the team and educated them on technical and process perspectives.
Confidential
ETL Developer
Responsibilities:
- Analyzed the requirements provided by the customer and prepared Design Documents based on the requirements
- Working with Agile/Scrum methodology to ensure delivery of high-quality work in every monthly iteration/sprint.
- Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
- Designed and developed complex jobs by using DataStage
- Developed new tools using UNIX shell scripting to reduce effort and redundant work in this project and to provide all the details of DataStage jobs.
- Involved in creating new table structures and modifying existing tables to fit the existing data model.
- Generated SQL queries to check the consistency of data in the tables and to update the tables per business requirements.
- Involved in Performance Tuning
- Good understanding of source-to-target data mapping and the business rules associated with the ETL processes.
- Used shell scripts for automating the execution of maps.
- Created shell scripts for validating the incoming files from other sources.
- Defect tracking, monitoring and on-time delivery.
- Identify problems/issues in production and resolve them within the Application Service Level Agreement (ASLA).
- Schedule jobs using Autosys and TWS.