
ETL Informatica Power Center Developer Resume


Las Vegas, NV

PROFESSIONAL SUMMARY:

  • Senior professional with around 6 years of experience, combining strong technical and problem-solving skills in Business Intelligence, ETL development with Informatica Power Center, and database development.
  • Worked with Confidential versions 15/14/13, Informatica Power Center 10.1/9.5/9.1/8.6, and Informatica Data Quality (IDQ) 9.5/9.1 as ETL tools for extracting, transforming, and loading data from a variety of sources to a variety of targets.
  • Worked extensively with Confidential utilities - FastLoad, MultiLoad, TPump, and Confidential Parallel Transporter (TPT) - to load large volumes of data from flat files into the Confidential database.
  • Used FastExport extensively to export data from Confidential tables.
  • Wrote BTEQ scripts to invoke the various load utilities, transform the data, and query against the Confidential database (see the load sketch after this list).
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all the available AMPs (see the DDL sketch after this list).
  • Extensive experience in integrating data from flat files (fixed-width and delimited), XML, and web services using Informatica transformations such as Source Qualifier, XML Parser, and Web Services Consumer.
  • Worked with various user groups and developers to define TASM workloads, developed TASM exceptions, and implemented filters and throttles as needed.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, and join and hash indexes in the Confidential database.
  • Industry experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Confidential SQL Assistant, and Queryman.
  • Expertise in Master Data Management (MDM) concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Scheduled and ran Tivoli Workload Scheduler (TWS v8.4) job streams and jobs requested by application support; created streams and jobs for day and night batch runs.
  • Worked in a Tableau environment to create dashboards such as yearly and monthly reports using Tableau Desktop and publish them to Tableau Server; converted Excel reports to Tableau dashboards with strong visualization and flexibility.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in both Confidential and Oracle.
  • Experience working with the Confidential PDCR utility.
  • Excellent experience with the different index types (PI, SI, JI, AJI, and PPI, both MLPPI and SLPPI) and with COLLECT STATISTICS.
  • Wrote Confidential macros and used various Confidential analytic functions (see the macro in the DDL sketch after this list).
  • Extensive knowledge of data warehouse approaches - top-down (Inmon) and bottom-up (Kimball) - and of dimensional modeling methodologies such as star schema and snowflake schema.
  • Good knowledge of Confidential Manager, TDWM, PMON, and DBQL.
  • Expertise in transforming data imported from disparate sources into analysis data structures using SAS functions, options, ODS, array processing, and the macro facility, and in storing and managing data in SAS data files.
  • Extensively used Informatica Power Center and Data Quality transformations - Source Qualifier, Aggregator, Update Strategy, Expression, Joiner, Lookup, Router, Sorter, Filter, Web Services Consumer, XML Parser, Address Validator, Comparison, Consolidation, Decision, Parser, Standardizer, Match, and Merge - to perform various data loading and cleansing activities.
  • Extensive knowledge of scheduling tools - Control-M, Autosys, Tivoli (TWS), ESP, and cron.
  • Extensively used Control-M Enterprise Manager to schedule jobs, perform initial data loads, and copy data from one environment to another when a new environment is set up.
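
A minimal sketch of the shell-wrapped FastLoad and BTEQ loading described above. The TDPID, credentials, and all object and file names (tdprod, sales_db.stg_orders, /data/in/orders.txt) are hypothetical placeholders, not details from this resume.

    #!/bin/ksh
    # Hypothetical load sketch: FastLoad stages a pipe-delimited flat file,
    # then BTEQ transforms the staged rows into a typed target table.

    fastload <<'EOF'
    LOGON tdprod/etl_user,etl_pass;
    DELETE FROM sales_db.stg_orders;   /* FastLoad requires an empty target */
    SET RECORD VARTEXT "|";
    DEFINE order_id (VARCHAR(18)), order_dt (VARCHAR(10)), amount (VARCHAR(18))
           FILE = /data/in/orders.txt;
    BEGIN LOADING sales_db.stg_orders
          ERRORFILES sales_db.stg_orders_e1, sales_db.stg_orders_e2;
    INSERT INTO sales_db.stg_orders (order_id, order_dt, amount)
    VALUES (:order_id, :order_dt, :amount);
    END LOADING;
    LOGOFF;
    EOF

    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;
    INSERT INTO sales_db.fact_orders (order_id, order_dt, amount)
    SELECT CAST(order_id AS INTEGER),
           CAST(order_dt AS DATE FORMAT 'YYYY-MM-DD'),
           CAST(amount   AS DECIMAL(18,2))
    FROM   sales_db.stg_orders;
    .IF ERRORCODE <> 0 THEN .QUIT 8;   /* fail the job on any SQL error */
    .LOGOFF;
    EOF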
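
A minimal DDL sketch of the PI/PPI and macro work described above, run through BTEQ. The table design, partition ranges, and macro are illustrative assumptions only.

    #!/bin/ksh
    # Hypothetical DDL sketch: a NUPI chosen for even AMP distribution, a
    # multi-level PPI (MLPPI) for date/region partition elimination, and a
    # parameterized macro. All object names are placeholders.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;

    CREATE TABLE sales_db.fact_orders
    ( order_id  INTEGER       NOT NULL,
      cust_id   INTEGER       NOT NULL,
      region_cd BYTEINT       NOT NULL,
      order_dt  DATE          NOT NULL,
      amount    DECIMAL(18,2)
    )
    PRIMARY INDEX (cust_id)   /* NUPI: frequently accessed column with enough
                                 cardinality to spread rows evenly over AMPs */
    PARTITION BY ( RANGE_N(order_dt BETWEEN DATE '2015-01-01'
                           AND DATE '2017-12-31' EACH INTERVAL '1' MONTH),
                   RANGE_N(region_cd BETWEEN 1 AND 10 EACH 1) );

    CREATE MACRO sales_db.monthly_sales (p_month DATE) AS (
      SELECT region_cd, SUM(amount) AS total_amt
      FROM   sales_db.fact_orders
      WHERE  order_dt BETWEEN :p_month AND ADD_MONTHS(:p_month, 1) - 1
      GROUP  BY region_cd;
    );

    EXEC sales_db.monthly_sales (DATE '2016-07-01');
    .LOGOFF;
    EOF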

PROFESSIONAL EXPERIENCE:

Confidential, Las Vegas, NV

ETL Informatica Power Center Developer

Responsibilities:

  • Analyzed requirements with business user groups, prepared functional specifications, translated the business requirements, and documented source-to-target mappings and ETL specifications.
  • Experienced in migrating the project's application interfaces.
  • Implemented PL/SQL queries, triggers, and stored procedures per the design and development requirements of the project.
  • Handled file transfers through SFTP scripts run from Informatica, along with UNIX shell scripts that mail clients on success or failure, per the business requirements (see the SFTP sketch after this list).
  • Extensively used the Teradata utilities - BTEQ scripts, MLOAD, and FLOAD - to load high volumes of data.
  • Moved code from DEV to QA to PROD and performed complete unit testing, integration testing, and user acceptance testing across the three environments.
  • Extracted data from an FTP site, loaded it into Hive, and then extracted data from Hive and loaded it into Confidential.
  • Developed Hive queries for various requirements and developed a Hive job to merge incremental files (see the Hive merge sketch after this list).
  • Worked with Source Analyzer, Warehouse Designer, Transformation Developer, Mapping Designer, and Workflow Manager to develop new mappings and implement the data warehouse.
  • Designed ETL processes for optimal performance.
  • Used Confidential Data Mover to copy data and objects such as tables and statistics from one system to another.
  • Used Confidential Data Mover for overall data management capabilities, including copying indexes and global temporary tables.
  • Other Confidential duties included managing workloads and performance using Confidential TASM, Confidential Dynamic Workload Manager, and Viewpoint, as well as managing Viewpoint itself (defining portlets, managing access, providing user training, and creating alerts).
  • Scheduled Informatica jobs to trigger BTEQ scripts with the help of the ESP job scheduler.
  • Automated SAS and Confidential jobs using Korn shell scripts and staged flat-file data in UNIX environments to support ETL processes.
  • Performed system-level and application-level tuning and supported the application development teams with database needs and guidance, using tools and utilities such as EXPLAIN, Visual Explain, PMON, and DBC views.
  • Created complex Informatica mappings and reusable transformations, and prepared various mappings to load the data into different stages such as landing, staging, and target tables.
  • Worked with cross-functional teams to resolve issues.
  • Worked in a Tableau environment to create dashboards such as weekly, monthly, and daily reports using Tableau Desktop.
  • Used the debugger in Informatica to test and fix mappings.
  • Used various transformations, including Source Qualifier, Expression, Aggregator, Joiner, Filter, Lookup, and Update Strategy, for design and optimization.
  • Modified several existing mappings based on user requirements and maintained existing mappings, sessions, and workflows.
  • Solid experience in performance tuning of Teradata SQL queries and Informatica mappings.
  • Worked on Teradata SQL Assistant, Teradata Administrator, Teradata Viewpoint, and BTEQ scripts.
  • Tuned mapping performance by following Informatica best practices and applied several methods to reduce workflow run times.
  • Involved in defect analysis for the UAT environment along with users to understand the data and make any needed code modifications.
  • Extensively worked on reading data from DB2 and Netezza and loading it into the Confidential data mart.
  • Implemented target load plans and pre-session and post-session scripts.
  • Prepared a recovery process for workflow failures caused by database or network issues.
  • Conducted thorough code reviews to ensure the outcome was in line with the objective and that all processes and standards were followed.
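
A minimal sketch of the SFTP transfer with success/failure mail notification described above. The host, paths, and addresses are hypothetical placeholders; the real scripts were triggered from Informatica.

    #!/bin/ksh
    # Hypothetical notification sketch: push an extract over SFTP, then mail
    # the client team on success or failure. All names are placeholders.

    FILE=/data/out/claims_extract.dat
    REMOTE=etluser@partner-sftp.example.com
    MAILTO="client-team@example.com"

    # Batch-mode SFTP reading commands from stdin; key-based auth assumed
    sftp -b - "$REMOTE" <<EOF
    put $FILE /inbound/
    bye
    EOF

    if [ $? -eq 0 ]; then
        echo "$(basename $FILE) delivered on $(date)" \
            | mailx -s "SFTP transfer SUCCESS" "$MAILTO"
    else
        echo "$(basename $FILE) FAILED on $(date)" \
            | mailx -s "SFTP transfer FAILURE" "$MAILTO"
        exit 1
    fi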
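
A minimal sketch of the Hive incremental-merge job mentioned above, using the common pre-ACID "latest row per key" pattern; the table and column names are illustrative assumptions.

    #!/bin/ksh
    # Hypothetical Hive merge sketch: rebuild a merged table keeping the most
    # recent version of each order_id from the base table plus newly landed
    # incremental rows.
    cat > /tmp/orders_merge.hql <<'EOF'
    DROP TABLE IF EXISTS orders_merged;
    CREATE TABLE orders_merged AS
    SELECT order_id, cust_id, amount, load_ts
    FROM (
      SELECT t.*,
             ROW_NUMBER() OVER (PARTITION BY order_id
                                ORDER BY load_ts DESC) AS rn
      FROM ( SELECT order_id, cust_id, amount, load_ts FROM orders_base
             UNION ALL
             SELECT order_id, cust_id, amount, load_ts FROM orders_incremental ) t
    ) ranked
    WHERE rn = 1;
    EOF
    hive -f /tmp/orders_merge.hql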

Environment: Confidential 15/14, HIVE, TASM, MultiLoad, FastLoad, TPump, FastExport, Tableau Desktop 8.0, Confidential Parallel Transporter (TPT), BTEQ, Confidential SQL Assistant, SAS v9.3/9.4, Confidential Utilities, Hadoop, Informatica Power Center 10.1/9.5, UC4, UNIX, Autosys, Business Objects XI R2, Oracle 11g, Netezza, Linux, Korn shell.

Confidential, CA

Senior Confidential / ETL Informatica Power Center Developer

Responsibilities:

  • Involved in meetings with business analysts on a data warehouse initiative; responsible for requirements gathering, preparing mapping documents, designing the ETL flow, building complex ETL procedures, and developing a strategy to move existing data feeds into the Data Warehouse (DW) and additional target data warehouses.
  • Collaborated with data architects, BI architects, and data modeling teams during data modeling sessions.
  • Worked extensively with Confidential utilities - FastLoad, MultiLoad, TPump, and Confidential Parallel Transporter (TPT) - to load large volumes of data into the Confidential database.
  • Extensively used FastExport to export data out of Confidential tables.
  • Created BTEQ scripts to invoke various load utilities, transform the data, and query against the Confidential database.
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all the available AMPs.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, and join and hash indexes in the Confidential database.
  • Hands-on experience using query tools such as TOAD, SQL Developer, PL/SQL Developer, Confidential SQL Assistant, and Queryman.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in Confidential and Oracle, as well as in Crystal Reports.
  • Excellent experience with the different index types (PI, SI, JI, AJI, and PPI, both MLPPI and SLPPI) and with COLLECT STATISTICS.
  • Wrote Confidential macros and used various Confidential analytic functions.
  • Good knowledge of Confidential Manager, TDWM, PMON, and DBQL.
  • Extensively used SQL Analyzer and wrote complex SQL queries using joins, subqueries, and correlated subqueries.
  • Worked in a Tableau environment to create dashboards such as yearly and monthly reports using Tableau Desktop and publish them to Tableau Server; converted Excel reports to Tableau dashboards with strong visualization and flexibility.
  • Extensively used Informatica transformations - Source Qualifier, Expression, Joiner, Filter, Router, Update Strategy, Union, Sorter, Aggregator, and Normalizer - to extract, transform, and load data from different sources into DB2 and Oracle targets.
  • Extensively used ETL Informatica to integrate data feeds from third-party source systems - claims, billing, and payments - and load them into the Confidential database.
  • Extensively worked on performance tuning of Informatica mappings.
  • Extensive knowledge of the defect tracking tool TRAC.
  • Extensively used RMS as the version control management tool.
  • Extensively used Power Exchange for Mainframe to read data from mainframe VSAM/COBOL files and load it into Confidential tables.
  • Constructed highly optimized SQL queries and Informatica mappings to transform data per business rules and load it into target databases.
  • Extensively used the Control-M and UC4 scheduling tools to load the scheduling charts and run the jobs for the initial load of tables whenever a new environment is created.
  • Scheduled data refreshes on Tableau Server in weekly and monthly increments, based on business changes, to ensure that views and dashboards displayed the changed data accurately.
  • Prepared implementation documents for every release, worked on initial loads and the data catch-up process during implementations, and provided on-call support for the first few days of execution.
  • Built UNIX/Linux shell scripts for running Informatica workflows, data cleansing, purging, deletion, data loading, and the ELT process (see the pmcmd and sed/awk sketch after this list).
  • Extensively used sed and awk commands for various string-replacement tasks.
  • Knowledge of Business Objects: updated the existing universe with new data objects and classes using Universe Designer, built joins between new tables, and used contexts to avoid loops and Cartesian products.
  • Created a procedure document of implementation steps for every release covering all UNIX and Informatica objects and any catch-up process that needed to be done.
  • Provided on-call support for the newly implemented components.
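
A minimal sketch of the shell wrappers described above: sed/awk cleansing followed by a pmcmd workflow start. The service, domain, folder, workflow, and file names are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical wrapper sketch: cleanse a pipe-delimited feed with sed/awk,
    # then start an Informatica workflow via pmcmd and check the return code.

    IN=/data/in/claims_raw.dat
    OUT=/data/in/claims_clean.dat

    # sed strips Windows carriage returns; awk keeps only rows with 12 fields
    # and upper-cases the 3rd (status) column
    sed 's/\r$//' "$IN" \
      | awk -F'|' -v OFS='|' 'NF == 12 { $3 = toupper($3); print }' > "$OUT"

    # Start the load workflow and wait for it to complete
    pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV \
          -u "$INFA_USER" -p "$INFA_PWD" \
          -f CLAIMS_FOLDER -wait wf_load_claims
    rc=$?
    if [ $rc -ne 0 ]; then
        echo "wf_load_claims failed with rc=$rc" >&2
        exit $rc
    fi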

Environment: Informatica Power Center 9.6/9.5, Confidential 14/13, FastLoad, MultiLoad, TPump, FastExport, Confidential SQL Assistant, TASM, BTEQ, SQL Developer, ERwin, PL/SQL, RMS, Linux, AIX, Netezza, Confidential Parallel Transporter (TPT).

Confidential

Confidential Developer

Responsibilities:

  • Involved in understanding the requirements of the end users/business analysts and developed strategies for ETL processes.
  • The project involved extracting data from various sources and applying transformations before loading the data into target (warehouse) stage tables and stage files.
  • Worked with the Informatica Power Center tools - Source Analyzer, Warehouse Designer, Mapping Designer, and Transformation Developer.
  • Created mappings using transformations such as Source Qualifier, Aggregator, Expression, Lookup, Router, Filter, and Update Strategy.
  • Configured the source and target database connections using .dbc files.
  • Used the BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands matching the business requirements against the Confidential RDBMS.
  • Worked with the DBA team to ensure implementation of the databases for the physical data models intended for the above data marts.
  • Created proper Confidential primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all the available AMPs; considering both the business requirements and these factors, created appropriate Confidential NUSIs for fast and easy access to data.
  • Followed an Agile software development methodology to build the ETL for the above data marts.
  • Created mapping documents from the EDS to the data mart, and created several loading strategies for fact and dimension loads.
  • Designed the mappings between sources (external files and databases) and the operational staging targets.
  • Performed the performance tuning of Confidential SQL statements using the Confidential EXPLAIN command (see the tuning sketch after this list).
  • Worked heavily with various built-in transform components to solve slowly changing dimension problems, creating process flow graphs using the Ab Initio GDE and Co>Operating System.
  • Analyzed the data distribution and reviewed the index choices.
  • Tuned Confidential SQL statements using EXPLAIN, analyzing the data distribution among AMPs and index usage, collecting statistics, and defining indexes.
  • Extensively worked in the UNIX environment using shell scripts.
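
A minimal sketch of the EXPLAIN-driven tuning loop described above, run through BTEQ. All object names are hypothetical placeholders.

    #!/bin/ksh
    # Hypothetical tuning sketch: collect statistics on the join and filter
    # columns, then EXPLAIN the query to check index use, redistribution
    # steps, and spool estimates. All names are placeholders.
    bteq <<'EOF'
    .LOGON tdprod/etl_user,etl_pass;

    COLLECT STATISTICS ON sales_db.fact_orders COLUMN (cust_id);
    COLLECT STATISTICS ON sales_db.fact_orders COLUMN (order_dt);

    EXPLAIN
    SELECT c.region_cd, SUM(f.amount)
    FROM   sales_db.fact_orders f
    JOIN   sales_db.dim_customer c ON c.cust_id = f.cust_id
    WHERE  f.order_dt >= DATE '2016-01-01'
    GROUP  BY c.region_cd;

    .LOGOFF;
    EOF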

Environment: Confidential RDBMS, BTEQ, FastLoad, MultiLoad, FastExport, Confidential Manager, Confidential SQL Assistant, Rational ClearQuest, UNIX, MQ, NDM, FTP

Confidential

ETL Developer/ Business Objects Developer

Responsibilities:

  • Worked with the Business Analysts for requirements gathering, business analysis, testing, and project coordination.
  • Involved in documenting functional specifications and other artifacts used for the development of ETL mappings.
  • Documented user requirements, translated requirements into system solutions, and developed implementation plans.
  • Developed a number of complex mappings, mapplets, and reusable transformations using Informatica Designer to facilitate daily and monthly data loads.
  • Developed core components of the project, including XML handling and XSD validation, and created well-defined views in the pre-staging area and loaded them.
  • Involved in designing, developing, and testing the ETL (extract, transform, load) strategy to populate the data from various source system feeds using MSBI (SSIS).
  • Optimized Performance of existing Informatica workflows.
  • Scheduled Informatica Workflows using workflow manager.

Environment: Oracle 9i, SQL Server 2000, DB2, Informatica Power Center 7.1, SSIS, Erwin, Cognos, XML, Windows NT/2000, Unix.
