
ETL - DataStage Resume Profile


Thousand Oaks, CA

PROFESSIONAL SUMMARY

  • Over 7 years of dynamic career experience reflecting high performance in system analysis, design, development, and implementation across the SDLC/PDLC; vast experience in data warehousing and UNIX shell scripting.
  • Excellent experience in designing, developing, documenting, and testing ETL jobs and mappings in server and parallel jobs using DataStage to populate tables in data warehouses and data marts.
  • Creating DDLs on Netezza and updating the Netezza data model using Erwin.
  • Creating and maintaining generic, reusable Netezza utilities.
  • Monitoring Netezza performance using the NzAdmin tool: reviewing query plans and expected execution times, and monitoring system utilization and performance.
  • Performing password changes on Netezza and DataStage for various accounts.
  • Transferring table contents between two different environments using the nzload utility in Netezza.
  • Using AQT to create external tables that can be transferred to another Netezza database.
  • Expert in designing parallel jobs using various stages such as Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator, and XML.
  • Experience in analyzing the data generated by the business process, defining the granularity, source to target mapping of the data elements, creating Indexes and Aggregate tables for the data warehouse design and development.
  • Excellent knowledge of studying data dependencies using metadata stored in the repository; prepared batches for existing sessions to facilitate scheduling of multiple sessions.
  • Proven track record in troubleshooting DataStage jobs and addressing production issues such as performance tuning and enhancement.
  • Expert in working on various operating systems, including Red Hat Enterprise Linux 5, AIX 5.2/5.1, and Windows 2000/NT/XP.
  • Expert in unit testing, system integration testing, and implementation and maintenance of database jobs.
  • Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with strong communication skills.
  • Capable of meeting short deadlines, quickly adapting to change, and working well in a fast-paced environment.
  • Strong ability to multi-task while managing time and commitments effectively.
  • Managing admin activities such as InfoSphere startup and shutdown.
  • Managing various activities in DataStage Administrator, such as adding accounts for new resources, granting access, and creating new environment variables.

TECHNICAL SKILLS

  • DataStage 8.5 (ETL)
  • Netezza
  • SQL, PL/SQL
  • Oracle 9i, 10g
  • SQL Developer, Toad
  • Erwin
  • Crystal Reports
  • AIX, Linux, Bash, Korn shell, UNIX shell scripting
  • Perl (limited)
  • Teradata (limited)
  • DWH concepts
  • MS Visio

EDUCATIONAL SUMMARY

Graduated with a Bachelor of Technology degree in Computer Engineering from Nagpur University, 2000-2004.

PROFESSIONAL EXPERIENCE

Confidential

ETL - Data Stage

Responsibilities:

  • Involved in providing ETL technical design reviews, development plan reviews, code reviews, test plans, and results per IBM DataStage best practices; prepared high-level and low-level design documents using Visio.
  • Provided technical guidance and leadership supporting a data warehousing ETL application involving IBM InfoSphere (IIS) DataStage and QualityStage version 8.5.
  • Coordinated with offshore team members on understanding requirements from the client/business users, creating project change requests for the team, and resolving technical issues in software development from the initial analysis phase through the complete cycle.
  • Delivering new and complex high-quality ETL solutions to clients in response to varying business requirements.
  • Developing ETL (Extraction, Transformation, and Load) jobs, batches, and sequences using DataStage/QualityStage with Linux scripting to integrate flows for complex business requirements.
  • Using the Netezza nzload utility to load data into tables and resolving issues when load errors occur.
  • Commonly used nzload parameters: -u, -pw, -db, -df, -lf, -bf, -delim, -dateStyle, -dateDelim, -maxErrors, etc.
  • Implementing DISTRIBUTE ON and ORGANIZE ON (zone-map key) clauses on Netezza to retrieve data from larger tables efficiently.
  • Extensively using QualityStage stages such as Reference Match, Standardize, Match Frequency, Survive, and Unduplicate Match to standardize data.
  • Involved in the design and development of the data acquisition process for the data warehouse including the initial load and subsequent refreshes.
  • Involved in full integration testing of all jobs within each sequence before deploying the jobs and sequencers from the development (Dev) environment to the subsequent environments.
  • Analyzing system requirements specification, developing test plans and test cases to cover overall quality assurance.
  • Tuning DataStage transformations and jobs to enhance their performance.
  • Managing admin activities such as InfoSphere startup and shutdown.
  • Performing password changes on Netezza and DataStage for various accounts.
  • Transferring table contents between two different environments using the nzload utility in Netezza.
  • Managing various activities in DataStage Administrator, such as adding accounts for new resources, granting access, and creating new environment variables.
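
As a hedged sketch of the nzload usage described above: the following shell fragment builds a small pipe-delimited extract and loads it with the parameters listed (-u, -pw, -db, -df, -lf, -bf, -delim, -dateStyle, -dateDelim, -maxErrors). The host, database, table, and account names are placeholders, not values from any actual project.

```shell
#!/bin/sh
# Hypothetical nzload invocation; SALESDW, SALES_FACT, and etl_user are
# illustrative placeholders.

# Build a tiny pipe-delimited extract (the format -delim '|' expects).
DATA_FILE=/tmp/sales_extract.dat
printf '1001|2013-01-15|250.00\n1002|2013-01-16|310.50\n' > "$DATA_FILE"

# Flags mirror the parameter list above:
#   -u/-pw credentials, -db target database, -t target table, -df data file,
#   -lf/-bf log and bad-record files, -delim field delimiter,
#   -dateStyle/-dateDelim input date format, -maxErrors abort threshold.
NZLOAD_CMD="nzload -u etl_user -pw \$NZ_PASSWORD -db SALESDW -t SALES_FACT \
  -df $DATA_FILE -lf /tmp/sales_extract.log -bf /tmp/sales_extract.bad \
  -delim '|' -dateStyle YMD -dateDelim '-' -maxErrors 10"

# Only run nzload where the Netezza client is installed; otherwise show it.
if command -v nzload >/dev/null 2>&1; then
    eval "$NZLOAD_CMD"
else
    echo "nzload not available; would run: $NZLOAD_CMD"
fi
```

Rejected rows land in the -bf bad file, which is the usual starting point when resolving load errors.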
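
The DISTRIBUTE ON / ORGANIZE ON tuning mentioned above can be sketched as DDL; the table and column names are hypothetical, and the nzsql invocation runs only where the client exists.

```shell
#!/bin/sh
# Hypothetical Netezza DDL illustrating DISTRIBUTE ON and ORGANIZE ON.
DDL=$(cat <<'EOF'
CREATE TABLE SALES_FACT (
    ORDER_ID    BIGINT        NOT NULL,
    ORDER_DATE  DATE          NOT NULL,
    CUSTOMER_ID INTEGER       NOT NULL,
    AMOUNT      NUMERIC(12,2)
)
DISTRIBUTE ON (ORDER_ID)   -- spreads rows evenly across data slices
ORGANIZE ON (ORDER_DATE);  -- clusters rows so zone maps prune date scans
EOF
)

# Run the DDL through nzsql where available; otherwise just display it.
if command -v nzsql >/dev/null 2>&1; then
    printf '%s\n' "$DDL" | nzsql -d SALESDW
else
    printf '%s\n' "$DDL"
fi
```

ORGANIZE ON makes the zone maps on the chosen column far more selective, which is what lets scans of large tables skip most extents.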

Environment: ETL, data warehousing, Netezza, DataStage 8.5, AQT, IBM AIX, UNIX shell scripting.

Confidential

ETL Teradata Data Stage Developer

Project Description: Confidential is one of the major wireless companies in Confidential. The company has huge volumes of data coming from different sources, generated by multiple cross-platform applications. The project involved the design and development of a data mart for the sales department.

Responsibilities:

  • Designed DataStage jobs
  • Developed shell scripts
  • Used BTEQ queries to retrieve data
  • Proposed and implemented as-of reporting data for product, customer, geography, and sales territory by archiving them on a monthly basis, which helped identify data at a desired point in time in the past
  • Validated account, distributor, customer, and channel partner details against order details stored in the dimension tables containing VIVO's order-related data
  • Identified issues once data was loaded into the QA environment for the client's manual invoice/credit memo data and coordinated closely with development teams on the mismatches noticed
  • Performed analysis and validated NetApp billing and revenue-related reserves data and their corresponding daily, weekly, monthly, quarterly, and yearly report data
  • Validated data integrity after additional measures were added to the client booking, billing, and revenue fact tables and to the sales order, sales order line, and sales order line extension dimension tables
  • Performed analysis and validation on the newly built supplier alternate hierarchy data for products and ensured it was in sync with the corporate hierarchy
  • Reviewed and approved the data model
  • Prepared the test strategy
  • Handled P1/P2 issues
  • Tested sensitive and high-priority time-variant data
  • Loaded data into Teradata data warehouse tables from multiple data sources using the Teradata utilities FastLoad, MultiLoad, BTEQ, and TPump
  • Used FastExport to extract data from tables
  • Developed mapplets using corresponding sources, targets, and transformations
  • Implemented Aggregator, Filter, Join, Expression, Lookup, Sequence Generator, Update Strategy, and XML transformations
  • Resolved queries among team members
  • Maintained on-time deliveries
  • Completed testing along with various other tasks, such as presenting critical modules to the Design Forum, and followed the steps required for each delivery
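
A minimal sketch of the BTEQ retrieval described above, wrapped in a shell script as such jobs typically are; the TDPID, credentials, and table names are placeholders, not values from any actual project.

```shell
#!/bin/sh
# Hypothetical BTEQ script: log on, run a retrieval query, quit with a
# non-zero code on error. tdprod, etl_user, and salesdw are placeholders.
BTEQ_SCRIPT=$(cat <<'EOF'
.LOGON tdprod/etl_user,etl_password;
SELECT order_id, order_date, amount
FROM   salesdw.sales_fact
WHERE  order_date >= DATE '2013-01-01';
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
)

# Pipe into bteq where the Teradata tools are installed; otherwise show it.
if command -v bteq >/dev/null 2>&1; then
    printf '%s\n' "$BTEQ_SCRIPT" | bteq
else
    printf '%s\n' "$BTEQ_SCRIPT"
fi
```

The same wrapper pattern applies to FastLoad/MultiLoad control scripts, with bteq swapped for the matching utility.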

Environment: DataStage, Teradata, BTEQ, Teradata SQL Assistant, UNIX shell scripting, Crystal Reports, IBM AIX, Oracle SQL

Confidential

ETL Developer

Project Description:

Confidential is the data processing unit for Confidential. This project involved enhancements, testing, and review of change requests and code change requests sent from onsite. Worked in the Property Mart section of this unit, responsible for various tasks including enhancements to weekly processes such as the IDC (IDCentric) data standardization process, the Propid process, and the Fusion process. In addition, worked on development modules such as data scrambling and the Role Swap project, plus various enhancement PCRs for the Fusion process, ensuring timely delivery of project execution documents and meeting CMMI Level 5 quality standards.

Responsibilities:

  • Designed and created Parallel Extender jobs that distribute incoming data concurrently across all processors to achieve the best performance
  • Understood project functionality by analyzing design documents and estimating coding effort
  • Performed documentation, analysis, and modification, and devised the strategy to execute projects/tasks
  • Used DataStage Parallel Extender stages, namely Dataset, Sort, Lookup, Change Capture, Funnel, Peek, and Row Generator, to accomplish the ETL coding
  • Used QualityStage to coordinate delivery and consistency and to remove data anomalies and spelling errors from the source information
  • Tracked each item's progress with team members and provided status to the onsite coordinator and client on a daily/weekly basis
  • Tracked the project by conducting toll-gate reviews of deliverables at the end of each stage of the methodology
  • Worked on performance tuning and enhancement of DataStage job transformations
  • Conducted reviews before sending deliverables to the client
  • Prepared all delivery documents
  • Communicated with end users and functional users and resolved issues
  • Developed SQL scripts to validate data after the loading process
  • Trained in the Six Sigma quality process
  • Maintained delivery and progress status
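
The post-load SQL validation mentioned above can be sketched as a simple row-count reconciliation; the library and table names (STGLIB.PROPERTY_STG, MARTLIB.PROPERTY_DIM) are hypothetical, and the query is only printed here rather than run against DB2/400.

```shell
#!/bin/sh
# Hypothetical post-load validation: reconcile staging vs. target counts.
VALIDATE_SQL=$(cat <<'EOF'
SELECT src.cnt AS source_rows,
       tgt.cnt AS target_rows,
       src.cnt - tgt.cnt AS missing_rows
FROM (SELECT COUNT(*) AS cnt FROM STGLIB.PROPERTY_STG) AS src,
     (SELECT COUNT(*) AS cnt FROM MARTLIB.PROPERTY_DIM) AS tgt
EOF
)

# A non-zero missing_rows value flags rows dropped or rejected by the load.
printf '%s\n' "$VALIDATE_SQL"
```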

Environment: Ascential DataStage 7.5.2 (DataStage, QualityStage, Information Analyzer, Metadata Workbench, Business Glossary), DB2/400, RPG/400, RPG-ILE, CL/400, SQL/400
