
ETL Developer Resume Profile


Professional Experience Summary:

  • Around 8 years of IT experience in the Banking and Financial Services (BFS) domain as an IBM InfoSphere DataStage Developer and Mainframe Programmer, well versed in DB2 and Oracle databases. Experienced in client/server business systems and data warehousing. Worked in IBM DataStage Enterprise 8.7/8.5/8.1 client-server environments.
  • Involved in various phases of the Software Development Life Cycle (SDLC), including analysis, design, construction, testing, deployment, and production support.
  • Involved in day-to-day interactions with the business to understand requirements and create design documents; designed, compiled, tested, scheduled, and ran DataStage jobs.
  • Expert in understanding and working with dimensional modeling (Star Schema).
  • Extensive experience in mapping, extracting, transforming, and loading data from heterogeneous sources such as Oracle 11g/10g/9i, DB2, and flat files using DataStage.
  • Experienced in troubleshooting DataStage jobs and addressing production issues such as job failures, performance tuning, and design enhancements.
  • Proficient in writing UNIX shell scripts and SQL queries.
  • Worked on migrating DataStage jobs from InfoSphere DataStage v8.5 to v8.7 and on migrating jobs from Mainframe to DataStage.
  • Excellent analytical, problem-solving, communication, and interpersonal skills, with the ability to interact with individuals at all levels.

Roles and Responsibilities:

  • Responsible for evaluating daily ETL development processes, design and code review procedures, and mentoring team members.
  • Provided technical assistance in scheduling, executing, and monitoring ETL processes using tools such as IXP/Autosys.
  • Coordinated with ETL architects and process analysts on the development and execution of system and integration test plans.
  • Executed performance tuning of applications to meet operational production needs.
  • Worked on Parallel Jobs and Job Sequences as part of DataStage design in all projects.
  • Suggested enhancement and optimization procedures for all ETL and DataStage processes.
  • Involved in translating business requirements and functional designs into High-level design documents.
  • Involved in preparing High-level and Low-level design documents.
  • Involved in the analysis, design, and testing of solutions, systems, and applications.
  • Created test strategies, test scenarios, and test procedures, as well as project documentation.
  • Created extensive SQL queries for data extraction to test data against DB2 databases.
  • Collaborated with Business Analysts to clarify application requirements.
  • Followed procedures and standards set by the project.
  • Performed structured application code reviews and walkthroughs.
  • Prepared test plans and test data for assigned work, performed unit/program tests, and applied fixes as needed.
  • Identified the technical cause and potential impact of errors and implemented coding or configuration changes.
  • Created and updated application documentation as code changes were applied.
  • Participated in pre- and post-implementation support activities.
  • Conducted weekly status meetings with the business/executive team.
  • Coordinated with the business for UAT and status meetings.
  • Trained and mentored newcomers as needed.
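As an illustration of the data-validation SQL mentioned above, a reconciliation query of that general shape might look like the following. This is only a hedged sketch: the `stg_account` and `dw_account` table names and their columns are hypothetical, not taken from the actual project.

```sql
-- Hypothetical example: compare per-day row counts between a staging
-- table and the warehouse target table in DB2 to flag incomplete loads.
SELECT src.load_date,
       src.src_count,
       tgt.tgt_count
FROM   (SELECT load_date, COUNT(*) AS src_count
        FROM   stg_account
        GROUP  BY load_date) src
LEFT JOIN
       (SELECT load_date, COUNT(*) AS tgt_count
        FROM   dw_account
        GROUP  BY load_date) tgt
       ON src.load_date = tgt.load_date
WHERE  tgt.tgt_count IS NULL
   OR  src.src_count <> tgt.tgt_count;
```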

Confidential

ETL Developer

Description: The DQAR data warehouse consolidates data coming from various disparate systems. It houses all client- and account-related information in an adaptable, integrated data model that enables flexible reporting and monitoring, proactive identification of data quality issues, and easy extraction and analysis of data, eliminating the need to write redundant ETL code and do extensive coding for client data stored across multiple systems.

Responsibilities:

  • Worked as the lead analyst, coordinating with clients to understand requirements and preparing High-level and Low-level design documents from the requirements specification.
  • Worked as a developer creating DataStage jobs and providing specifications to the offshore team for DataStage development, as well as coordinating QA activity in the QA environment.
  • Worked with the BA to finalize system requirements and gave input on existing system processes for future enhancements.
  • Coordinated with offshore team members and the testing team in the Build and Test phases.
  • Effectively coordinated with the team to implement the project on time.
  • Worked as Lead Developer in migrating the project from DataStage 8.5 to an 8.7 Grid environment.
  • Developed archival and removal jobs for effective usage of server space.
  • Involved in unit testing and defect fixing.
  • Involved in creating JIL definitions and scheduling jobs in Autosys.
  • Involved in moving jobs to production.
  • Handled daily, weekend, and monthly production support.
  • Performance tuning.
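The Autosys scheduling work above can be illustrated with a minimal JIL definition. This is a generic sketch only; the job, script, machine, and dependency names are hypothetical, not taken from the actual project:

```
/* Hypothetical command job: run the daily load script at 02:00
   once the upstream extract job has completed successfully. */
insert_job: DQAR_DAILY_LOAD
job_type: c
command: /opt/etl/scripts/run_daily_load.sh
machine: etl_server_01
owner: etluser
start_times: "02:00"
condition: s(DQAR_EXTRACT)
std_out_file: /var/log/autosys/dqar_daily_load.out
std_err_file: /var/log/autosys/dqar_daily_load.err
alarm_if_fail: 1
```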

Environment: Oracle, DataStage v8.5/v8.7, UNIX, IXP/Autosys.

Confidential

ETL / Mainframe developer

Description: The Global Business Intelligence Services (GBIS) system supports the Bank of America - Merrill Lynch data warehousing system called MIDAS. The MIDAS Data Warehouse is one of Merrill Lynch's largest aggregators of data. It sources data from other Bank of America - Merrill Lynch systems and stores the information in a DB2 database. It is a repository leveraged throughout Global Private Client and other divisions to satisfy the business community's decision-support requirements. Business Intelligence Services (BIS), the technology group responsible for supporting MIDAS, collects and consolidates data from both internal systems and external sources. Its processes transform the source data according to business rules defined by its clients and load the results to a distributed platform for access by authorized users. The system is also used to generate reports on historical data. The GBIS application runs on Mainframe and DataStage 8.1 using COBOL, JCL, VSAM, and DB2.

Responsibilities:

  • Coordinated with clients to understand requirements and prepared High-level and Low-level design documents from the requirements specification.
  • Coordinated with offshore team members in the Build and Test phases.
  • Performed impact analysis and defined test plans and test cases.
  • Designed and developed Mainframe and DataStage jobs to implement business transformations using different stages.
  • Worked with the build, configuration, and application testing teams, as well as component owners, infrastructure teams, and enterprise Change Control groups, on large-scale major releases, independent initiatives, and technical environment upgrade/enhancement initiatives.
  • Executed test cases and obtained signoff from users.
  • Effectively coordinated with the team to implement the project on time.
  • Identified and created datasets as intermediate stages for effective reuse.
  • Developed archival and removal jobs for effective usage of server space.
  • Involved in unit testing and defect fixing.
  • Prepared unit test specifications.
  • Provided UAT support.
  • Involved in creating JIL definitions and scheduling jobs in Autosys.
  • Involved in moving jobs to production and in warranty support.
  • Performance tuning.
  • Project monitoring and metrics reporting.
  • Handled DataStage abends.
  • Created SWAT reports.

Environment: UNIX, Ascential DataStage 7.5, DataStage 8.1 (IBM InfoSphere Information Server), IXP, Autosys, Mainframes, OS/390, COBOL, JCL, DB2, Endevor, SPUFI, QMF, STROBE, Platinum, JCLSYNC, REVINE, TEST, SMART TEST.

Confidential

Mainframe developer

About the project: The FS1 Tracking project is an enhancement to the FS1 Accounting project. It deals with major maintenance requests that fall under the conversion of MLTRAC calculator modules to execute in batch mode. Another major request handled as part of this project is building historic tables for various fund groups, which will decommission legacy file systems and convert all historical records into the MLTRAC historic database. With these enhancements, the respective fund groups will be able to retrieve account-level historic transactions and perform analysis based on the reports.

Responsibilities:

  • Requirement study.
  • Impact analysis for the migration process, performance tuning, and new requirements or change requests on existing programs initiated by the client.
  • Preparation of detailed design documents.
  • Coding and creation of test plans.
  • Performing unit testing, system testing, and UAT.
  • Performance tuning and re-engineering.
  • Reviews to ensure the overall quality of the project.
  • Interaction with the onsite team and clients during the requirement and testing phases.
  • Interacting with the onsite coordinator on a regular basis regarding deliverables and status updates.

Environment: IBM Mainframes, OS/390, COBOL, JCL, DB2, ENDEVOR, FILE-AID, SPUFI, REVINE, JCLSYNC, TEST, SMART TEST.

Confidential

MAINFRAME Developer

  • Performed several basic ISPF operations such as creating, copying, and listing datasets. Prepared JCL job statements to create PS and PDS datasets to given specifications, and tested various conditions and operations.
  • Used the IEBGENER, IEFBR14, IEBCOPY, and SORT IBM utilities, and read, wrote, and modified VSAM files.
  • Implemented a system using COBOL and DB2 to handle functionalities such as inserting, deleting, and updating records and creating report files.
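The IEBGENER usage described above can be sketched as a minimal JCL job step. This is a generic illustration; the job name and dataset names are hypothetical, not taken from the actual work:

```
//COPYJOB  JOB (ACCT),'COPY PS FILE',CLASS=A,MSGCLASS=X
//* Hypothetical step: copy a sequential input dataset to a new
//* catalogued output dataset using the IEBGENER utility.
//STEP01   EXEC PGM=IEBGENER
//SYSPRINT DD  SYSOUT=*
//SYSUT1   DD  DSN=USERID.INPUT.DATA,DISP=SHR
//SYSUT2   DD  DSN=USERID.OUTPUT.DATA,
//             DISP=(NEW,CATLG,DELETE),
//             SPACE=(TRK,(5,5),RLSE),
//             DCB=(RECFM=FB,LRECL=80,BLKSIZE=800)
//SYSIN    DD  DUMMY
```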

Responsibilities:

  • Requirement gathering
  • Design
  • Coding
  • Unit Testing
  • User Documentation
  • Package and Delivery

Environment: IBM Mainframes, COBOL, DB2, and the IEBGENER, IEFBR14, IEBCOPY, and SORT IBM utilities.
