
ETL Lead/Developer Resume

Chicago

SUMMARY:

  • 11 years of IT experience building data warehouse applications with Informatica and DataStage, the BI tool OBIEE, and databases including Oracle 10g/9i/8i, SQL Server, and Teradata.
  • Extensive experience in the Pharmacy, Healthcare, Banking, Insurance, Trading, and Retail domains.
  • Ability to analyze source systems and business requirements, identify and document business rules, and design data architecture.
  • Extensive knowledge of dimensional data modeling: star schema and snowflake modeling, fact and dimension tables, and physical and logical data modeling using the Erwin data modeling tool.
  • Experience with the onsite/offshore model and managing teams.
  • Exposure to Hadoop components such as Hive, Pig, Sqoop, HDFS, and MongoDB.
  • Strong experience with data migration/acquisition and integration projects.
  • Experience in OLTP/OLAP system study, analysis, and ER modeling; developed database schemas such as star and snowflake schemas used in relational and multidimensional modeling with Erwin and Visio.
  • Experience with Informatica Data Quality (IDQ), IDS, Master Data Management (MDM), and Informatica B2B Data Transformation.
  • Experience with Informatica BDE for data integration.
  • Experience with data profiling, data lineage, scorecards, and data standardization using Informatica Developer/Analyst/MDM.
  • Extensive experience with Informatica Data Quality (IDQ) components: data profiling, data validation, data cleansing, and data masking, especially for file-based sources.
  • Extensive experience implementing CDC using Informatica PowerExchange 8/9.
  • Expertise in administration tasks including importing/exporting mappings, copying folders across the DEV/QA/PRD environments, managing users, groups, and associated privileges, and performing repository backups.
  • Exposure to the BI analytical tools OBIEE and Cognos.
  • Developed complex mappings with varied transformation logic: unconnected/connected Lookups, Router, Filter, Expression, Aggregator, Joiner, Union, Update Strategy, and more.
  • Experience with Python scripting.
  • Expertise in UNIX and in writing Perl and UNIX shell scripts.
  • Experience in performance tuning of sources, targets, mappings, transformations, and sessions by applying techniques such as partitioning and pushdown optimization, and by identifying performance bottlenecks.
  • Good exposure to development, testing, debugging, implementation, documentation, end-user training, and production support.
  • Expertise in unit testing, integration testing, system testing, and data validation of developed Informatica mappings.
  • An excellent team member with the ability to work independently, good interpersonal relations, strong communication skills, and a high level of motivation; a quick learner with an aptitude for taking responsibility.

TECHNICAL SKILLS:

ETL Tools: Informatica PowerCenter 10.1/9.6/9.5/9.1/8.x, IDQ, B2B, DataStage.

Databases: Oracle 11g/11i/10g/9i/8i, SQL Server 2008 R2/2008/2005/2000, Teradata 13.0, MongoDB, Sybase, DB2, MS Access 7.0/97/2000.

Programming Languages: SQL, PL/SQL, Python, Core Java, UNIX Shell Scripting.

Operating Systems: UNIX, Windows 7/XP/2003/2008, Linux

Business Intelligence Tools: SAP BO, OBIEE

Hadoop Components: Pig, Hive, Sqoop

DB Tools: SQL Server Management Studio, TOAD, Vertica, SQL*Plus, SQL*Loader, Teradata SQL Assistant 13.10

Other Tools: MS Visio, Web Services, Erwin, DAC, Tivoli

PROFESSIONAL EXPERIENCE:

Confidential, Chicago

ETL Lead/Developer

Responsibilities:

  • Worked as application architect implementing the end-to-end migration project.
  • Analyzed the existing project architecture and prepared design documents.
  • Translated business requirements into technical design documents.
  • Interacted with business users and gathered the requirements to be met during migration.
  • Built logical and physical data models using CA Erwin Data Modeler, based on business requirements and the existing architecture.
  • Integrated Documentum and TRACTS (Teradata) data into SDAp using PowerCenter/IDQ.
  • Performed data analysis and data profiling using IDQ.
  • Handled all facets of IDQ implementation, including data profiling, metadata acquisition, data migration, validation, reject processing, and pre-landing processing.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both IDQ and data integration projects.
  • Applied Informatica Data Quality (IDQ) and data integration concepts in large-scale implementation environments.
  • Connected to the Documentum system and integrated TRACTS data into the SDAp system using MDM integration.
  • Parsed semi-structured/unstructured data to XML, and XML to relational tables, using Data Transformation.
  • Applied Master Data Management (MDM) and data integration concepts in large-scale implementation environments.
  • Implemented data profiling with scorecards, using Informatica Developer/Informatica Analyst, so business users could understand the data.
  • Created data lineage on profiled columns using MDM; added strategies to IDQ transformations where required.
  • Debugged the mappings of failed sessions.
  • Analyzed and created fact and dimension tables after data loads.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Validated data between the existing process and the new process.
  • Used DAC to schedule ETL jobs, including dynamic parameter file generation.
  • Analyzed the database and provided detailed information on operations and forecasts for future years.
  • Generated reports with Cognos.
  • Implemented enhancements and handled deployment through to production.
  • Supported production issues.

Environment: Informatica 10.1, IDQ, MDM, Teradata, Oracle 11g, Cognos, Documentum

Confidential, Foster City, CA

ETL Lead/Developer

Responsibilities:

  • Worked as application architect implementing the end-to-end migration project.
  • Analyzed the existing project architecture and prepared design documents.
  • Translated business requirements into technical design documents.
  • Interacted with business users and gathered the requirements to be met during migration.
  • Built logical and physical data models using CA Erwin Data Modeler, based on business requirements and the existing architecture.
  • Led a medium-sized offshore team.
  • Performed data analysis and data profiling using IDQ.
  • Handled all facets of IDQ implementation, including data profiling, metadata acquisition, data migration, validation, reject processing, and pre-landing processing.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both IDQ and data integration projects.
  • Applied Informatica Data Quality (IDQ) and data integration concepts in large-scale implementation environments.
  • Implemented data acquisition, data analysis for data quality, and data standardization using Informatica PIM.
  • Connected to the Documentum system and integrated TRACTS data into the SDAp system using MDM integration.
  • Parsed semi-structured/unstructured data to XML, and XML to relational tables, using Data Transformation.
  • Applied Master Data Management (MDM) and data integration concepts in large-scale implementation environments.
  • Prepared match rules for integrating/matching against master data in MDM.
  • Validated data between the existing process and the new process.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both MDM and data integration projects.
  • Defined trust scores, validation rules, match rules, and merge settings.
  • Worked closely with other IT team members, business partners, data stewards, stakeholders, steering committee members, and executive sponsors on all MDM and data governance activities.
  • Involved in unit and integration testing of mappings and sessions.
  • Supported production issues.
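The trust-score and merge-rule work described above can be sketched as trust-based survivorship: for each field, the value from the most trusted source system survives into the golden record. Source names, field names, and trust scores below are hypothetical, for illustration only.

```python
# Minimal sketch of MDM-style trust-based survivorship (a merge rule).
# Source systems, fields, and trust scores are hypothetical.

TRUST = {"CRM": {"name": 90, "phone": 60},
         "ERP": {"name": 70, "phone": 80}}

def merge(records):
    """records: list of (source_system, {field: value}) pairs.
    For each field, keep the value from the most trusted source."""
    golden = {}
    best = {}  # field -> trust score of the value currently kept
    for source, fields in records:
        for field, value in fields.items():
            score = TRUST.get(source, {}).get(field, 0)
            if value is not None and score > best.get(field, -1):
                golden[field] = value
                best[field] = score
    return golden

golden = merge([("CRM", {"name": "J. Smith", "phone": None}),
                ("ERP", {"name": "John Smith", "phone": "555-0100"})])
# name survives from CRM (trust 90), phone from ERP (trust 80)
```

A real MDM Hub adds match rules, validation rules, and decay of trust over time; this sketch covers only the per-field survivorship decision.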

Environment: Informatica 10.1, Oracle 11g, IDQ, OBIEE, DAC

Confidential

ETL Technical Lead

Responsibilities:

  • Responsible for creating the Teradata Financial Services Logical Data Model architecture, followed by the physical, semantic, and presentation layers.
  • Interacted with business users to gather requirements for data usage and report generation.
  • Managed the offshore team and supported them from technical and functional perspectives.
  • Prepared technical and non-technical documents such as design specifications, operation guides, and process flows.
  • Implemented Teradata load scripts using Teradata Parallel Transporter.
  • Loaded data into the target ODS: from the SQL Server source system into staging tables and then into the target Oracle database.
  • Developed various mappings, mapplets, and transformations for data marts and the data warehouse.
  • Worked as a business analyst responsible for gathering requirements and IT review; interacted with business users on the design of technical specification documents.
  • Loaded data from source file systems such as flat files and EDI into Teradata for a couple of subject areas.
  • Monitored all running, scheduled, completed, and failed sessions; debugged the mappings of failed sessions.
  • Analyzed and created fact and dimension tables.
  • Performed data analysis and data profiling using IDQ.
  • Handled all facets of IDQ implementation, including data profiling, metadata acquisition, data migration, validation, reject processing, and pre-landing processing.
  • Performed the requirement gathering, analysis, design, development, testing, implementation, support, and maintenance phases of both IDQ and data integration projects.
  • Applied Informatica Data Quality (IDQ) and data integration concepts in large-scale implementation environments.
  • Implemented data acquisition, data analysis for data quality, and data standardization using Informatica PIM.
  • Created business rules in IDQ to reduce duplicates, analyzed in the IDQ Analyst tool.
  • Added strategies to IDQ transformations based on requirements wherever needed.
  • Developed Informatica batch and real-time (CDC) feed processes.
  • Used Informatica PowerExchange to load and retrieve data from the mainframe system.
  • Developed MDM Hub match and merge rules, batch jobs, and batch groups.
  • Created IDQ mappings and mapplets on different sources for validation.
  • Prepared Python scripts to retrieve online query and web-based applicant data from customers.
  • Involved in unit and integration testing of mappings and sessions.
  • Assisted the testing team in creating test plans and test cases.
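The Python scripting mentioned above, pulling web-based applicant data into a staging-friendly shape, might look like the following sketch. The JSON payload, field names, and normalization rules are hypothetical stand-ins for the real feed.

```python
# Sketch of a Python script that ingests web-based applicant data
# (hypothetical JSON payload) and normalizes it for ETL staging.
import json

RAW = '''[{"Name": " Jane Doe ", "applied": "2016-03-01", "email": "JANE@EX.COM"},
          {"Name": "Bob Roe", "applied": "2016-03-02", "email": "bob@ex.com"}]'''

def normalize(payload):
    """Parse the raw feed and standardize names and emails."""
    rows = []
    for rec in json.loads(payload):
        rows.append({
            "name": rec["Name"].strip(),          # trim stray whitespace
            "applied": rec["applied"],            # already ISO-8601 here
            "email": rec["email"].strip().lower() # case-fold for matching
        })
    return rows

staged = normalize(RAW)
```

In practice the payload would come from an HTTP endpoint or landing file rather than an inline string, and the normalized rows would be written to a staging table.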

Environment: Informatica 9.6, OBIEE, Web Services, Java, Hadoop, Teradata 13.0, IDQ, MDM

Confidential

Informatica Lead/Admin

Responsibilities:

  • Project implemented in an onsite/offshore model.
  • Interacted with business users to gather requirements for data usage and report generation.
  • Involved in creating the Informatica repository architecture following the Kimball model.
  • Created and monitored database maintenance plans for checking database integrity, data optimization, rebuilding indexes, and updating statistics.
  • Acted as team lead for the development team.
  • Loaded data into different targets from various source systems (Oracle, flat files, XML files, SQL Server, etc.) into staging tables and then into the target Oracle database.
  • Developed various mappings, mapplets, and transformations for data marts and the data warehouse.
  • Worked as a business analyst responsible for gathering requirements and IT review; interacted with business users on the design of technical specification documents.
  • Re-designed ETL mappings to improve data quality.
  • Created Stored Procedure transformations to populate targets based on business requirements.
  • Monitored all running, scheduled, completed, and failed sessions; debugged the mappings of failed sessions.
  • Used the pipeline partitioning feature in sessions to reduce load time.
  • Analyzed and created fact and dimension tables.
  • Used Informatica features to implement Type I, II, and III changes in slowly changing dimension tables.
  • Created data breakpoints and error breakpoints for debugging mappings with the Debugger Wizard.
  • Involved in Informatica version upgrades.
  • Performed administration activities.
  • Developed pre- and post-session shell scripts that create parameter files dynamically.
  • Wrote documentation describing program development, logic, coding, testing, changes, and corrections.
  • Delivered projects on time, covering requirements gathering, business analysis, and data mart design.
  • Involved in unit and integration testing of mappings and sessions.
  • Developed schedules for daily and weekly batches using UNIX Maestro.
  • Prepared ETL mapping specification documents.
  • Assisted the testing team in creating test plans and test cases.
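The dynamic parameter file generation mentioned above can be sketched as follows; shown here in Python rather than shell for readability. The folder, workflow, and parameter names are hypothetical, but the `[FOLDER.WF:workflow]` section header and `$$NAME=value` lines follow the general PowerCenter parameter file layout.

```python
# Sketch of a pre-session step that writes an Informatica parameter
# file dynamically. Folder, workflow, and parameter names are hypothetical.
import os
import tempfile
from datetime import date

def write_param_file(path, run_date):
    """Write a parameter file keyed to the run date and return its lines."""
    lines = [
        "[FOLDER_SALES.WF:wf_daily_load]",   # hypothetical folder/workflow
        f"$$RUN_DATE={run_date:%Y%m%d}",
        f"$$SRC_FILE=/data/in/sales_{run_date:%Y%m%d}.dat",
    ]
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return lines

path = os.path.join(tempfile.gettempdir(), "wf_daily_load.par")
lines = write_param_file(path, date(2016, 1, 15))
```

A post-session script would typically archive or purge the generated file once the workflow completes.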

Environment: Informatica 8.6, Oracle 10g, UNIX

Confidential

ETL Lead Developer

Responsibilities:

  • Responsible for business meetings and for converting business rules into technical specifications for day-to-day trading activities.
  • Project implemented in an onsite/offshore model.
  • Implemented SimCorp to integrate all financial applications.
  • Prepared PL/SQL scripts (stored procedures, functions, and packages).
  • Performed performance tuning at the SQL script level.
  • Prepared PL/SQL blocks for single execution for a few infotypes.
  • Tuned SQL queries used to extract from the source system.
  • Created indexes and function-based indexes on required tables.
  • Worked with Informatica PowerCenter client tools: Repository Manager, Designer, Workflow Manager, and Workflow Monitor.
  • Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer.
  • Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager.
  • Responsible for error handling using session logs and reject files in the Workflow Monitor.
  • Worked extensively with the Debugger to handle data errors in the Mapping Designer.

Environment: Informatica 8.6, Oracle 10g, Agile

Confidential

Senior Developer

Responsibilities:

  • Met with the business to gather requirements.
  • Provided technical leadership to other support team members and resolved technical issues.
  • Provided recommendations on database index strategies based on performance metrics.
  • Reviewed data models and performed code reviews.
  • Created complex stored procedures, functions, triggers, and packages.
  • Created and maintained the overall and detailed project plans and supervised the DW ETL processes.
  • Used DataStage to develop jobs for extracting, transforming, integrating, and loading data into the staging and integration layers.
  • Migrated existing Pentaho jobs to DataStage for a couple of subject areas.
  • Used Transformer, Lookup, and Aggregator stages in the Designer to implement complex business logic.
  • Used key management functions to generate surrogate keys for composite attributes while loading data into the data warehouse.
  • Developed user-defined stages to implement complex business logic.
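The surrogate-key generation for composite attributes mentioned above boils down to mapping each distinct composite natural key to a stable warehouse key. A minimal sketch, with hypothetical key columns:

```python
# Sketch of surrogate-key generation for a composite natural key,
# as done while loading a dimension. Key columns are hypothetical.
class SurrogateKeyGenerator:
    def __init__(self):
        self._keys = {}   # composite natural key -> surrogate key
        self._next = 1    # next surrogate key to hand out

    def key_for(self, *natural_key):
        """Return the surrogate key for this composite key,
        assigning a new one on first sight."""
        nk = tuple(natural_key)
        if nk not in self._keys:
            self._keys[nk] = self._next
            self._next += 1
        return self._keys[nk]

gen = SurrogateKeyGenerator()
k1 = gen.key_for("ACME", "US")   # new composite key
k2 = gen.key_for("ACME", "UK")   # different composite key
k3 = gen.key_for("ACME", "US")   # seen before: same key as k1
```

In DataStage the same effect comes from the surrogate key stage or key management functions backed by a state file or database sequence rather than an in-memory dict.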

Environment: DataStage 8.0, Pentaho, Oracle 10g

Confidential

Team Lead

Responsibilities:

  • Project implemented in an onsite/offshore model.
  • Responsible for business meetings and for converting business rules into technical specifications.
  • Prepared PL/SQL scripts (stored procedures, functions, and packages).
  • Performed performance tuning at the SQL script level.
  • Prepared PL/SQL blocks for single execution for a few infotypes.
  • Tuned SQL queries used to extract from the source system.
  • Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer.
  • Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager.
  • Responsible for extracting data from Oracle and flat files.
  • Implemented data quality with Informatica Developer (IDQ).
  • Performed data analysis, data cleansing, and data profiling using IDQ capabilities.
  • Prepared data quality business rules.
  • Created GUI dashboards for client understanding.
  • Excellent knowledge of the Informatica platform as a whole and of the integration among different Informatica components and services.
  • Responsible for Informatica PowerCenter performance at the target, source, mapping, session, and system levels.
  • Worked extensively with both connected and unconnected Lookups.

Environment: Informatica 8.6, Oracle 10g

Confidential

Software Consultant

Responsibilities:

  • Built jobs based on functional mapping specs for different subject areas, especially HR Payroll Acc.
  • Created sequence and parallel jobs based on the mapping specs.
  • Prepared PL/SQL scripts (stored procedures, functions, and packages).
  • Created list reports; prepared PL/SQL blocks for single execution for a couple of infotypes.
  • Read data from applications such as Oracle ERP and EBS to integrate into SAP ABAP.
  • Created indexes and function-based indexes on required tables.
  • Debugged DataStage jobs; involved in unit and performance testing; prepared documentation and technical specification documents.
  • Involved in reconciliation and user acceptance testing.
  • Migrated data from financial modules such as Oracle Apps GL, AP, and PO.
  • Manually modified SQL in Report Studio to tune and/or write complicated reports; used union/join objects in Report Studio.
  • Generated various list reports, grouped reports, crosstab reports, chart reports, and drill-down and drill-through reports.
  • Created dashboards presenting critical company data in a single report.
  • Developed jobs and sequencers in the DataStage Designer to extract data from variable sources such as flat files and mainframe databases and load it into the data warehouse per the specified design.
  • Acted as developer on the ETL development side.
  • Created low-level and high-level design documents.
  • Prepared unit test cases and unit test results documents, and performed end-to-end testing of the jobs.

Environment: DataStage 8.0, SAP BO, Oracle 10g, Oracle ERP, SAP ABAP

Confidential, Baltimore, MD

ETL Developer

Responsibilities:

  • Prepared PL/SQL scripts such as procedures, functions, and packages.
  • Involved in performance tuning at the database and ETL mapping levels.
  • Tuned SQL queries used to extract from the source system.
  • Created views and materialized views for the analysis team.
  • Created simple, complex, and function-based indexes on required tables.
  • Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Developer, and Mapping Designer.
  • Worked with Task Developer, Worklet Designer, and Workflow Designer in the Workflow Manager.
  • Responsible for extracting data from Oracle, Sybase, and flat files.
  • Worked extensively with both connected and unconnected Lookups.
  • Worked extensively with lookup caches (persistent, static, and dynamic) to improve the performance of lookup transformations.
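The static vs. dynamic lookup cache distinction above can be sketched as follows: a static cache is built once and only read, while a dynamic cache is updated as rows flow through, so later rows in the same run see earlier inserts. The customer data is hypothetical.

```python
# Sketch contrasting a static vs. a dynamic lookup cache, in the spirit
# of Informatica lookup transformations. Data is hypothetical.
target = {"C001": "Alice"}        # existing target rows: id -> name

static_cache = dict(target)       # static: built once, read-only thereafter

dynamic_cache = dict(target)      # dynamic: updated as rows are processed
def lookup_dynamic(cust_id, name):
    """Return True if the row is new (and add it to the cache)."""
    if cust_id in dynamic_cache:
        return False              # match found: route row to the UPDATE path
    dynamic_cache[cust_id] = name # miss: cache it, route row to the INSERT path
    return True

new1 = lookup_dynamic("C002", "Bob")   # cache miss: flagged for insert
new2 = lookup_dynamic("C002", "Bob")   # now a hit: not inserted again
```

This is why dynamic caches are the usual choice for loading slowly changing dimensions from sources that may contain duplicate keys within one run.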

Environment: Informatica 7.x/8.x, Oracle 9i
