
Sr. ETL/Informatica Developer Resume Profile

Chicago, IL

Profile:

  • 7 Years of IT Experience
  • 7 Years of Informatica Experience

Summary:

  • 7 years of IT experience in Data Warehouse/Data Mart design, system analysis, development, database design, and SQL/PL/SQL programming.
  • Expertise in implementing complex business rules by creating robust mappings, mapplets, and reusable transformations using Informatica PowerCenter.
  • Experience in Data Warehouse development working with Extraction/Transformation/Loading using Informatica Power Mart/PowerCenter with flat files, Oracle, SQL Server, and Teradata.
  • Experience working with Informatica PowerCenter 9.1/8.6.1/8.5.1/8.1.1/7.x/6.x: Designer, Workflow Manager, Workflow Monitor, Source Analyzer, Warehouse Designer, Transformation Developer, Mapplet Designer, Mapping Designer, Workflow Designer, Task Developer, Worklet Designer, Gantt Chart, Task View, mapplets, mappings, workflows, sessions, reusable transformations, shortcuts, and the Import and Export utilities.
  • Very strong in SQL and PL/SQL, with extensive hands-on experience creating database tables, triggers, sequences, functions, procedures, and packages, and in SQL performance tuning.
  • Experience working in multi-terabyte data warehouses using databases and sources such as Oracle 11g/10g/9i, MS Access 2000/2002, XML, IBM UDB DB2 8.2, SQL Server 2008, MS Excel, and flat files.
  • Experience in Relational Modeling and Dimensional Data Modeling using Star and Snowflake schemas, denormalization, normalization, and aggregations.
  • Proficiency in data warehousing techniques such as data cleansing, Slowly Changing Dimension handling, surrogate key assignment, and change data capture.
  • Good understanding of ETL/Informatica standards and best practices, including Slowly Changing Dimensions (SCD1, SCD2, SCD3).
  • Experience in test coordination, writing test cases, executing test scripts, and logging defects in Quality Center (QC).
  • Created personalized versions of reports and statements for customers using data from Informatica metadata, then generated Business Objects reports using slice-and-dice capabilities.
  • Experience with Extraction, Transformation, and Loading (ETL) of data from disparate sources and multiple relational databases such as Oracle and DB2 UDB; integrated data from flat files, CSV files, and XML files into a common reporting and analytical data model using Erwin.
  • Worked extensively with various kinds of queries, such as sub-queries, correlated sub-queries, and union queries, for query tuning.
  • Experience coordinating and leading onsite-offshore development.

TECHNICAL SKILLS:

ETL Tools

Informatica 9.0/8.6.1/8.5.1/8.1.1/7.1.4/6.2/5.1

Power Center/Power Mart, SQL Server DTS

Reporting Tools

Business Objects, Crystal Reports, Cognos

Dimensional Data Modeling

Dimensional Data Modeling: Star Join Schema modeling, Snowflake modeling, fact and dimension tables, physical and logical data modeling, Erwin 3.5.2/3.x, Oracle Designer.

Databases

Oracle 11g/10g/9i/8i/8.0/7.x (Stored Procedures, Triggers, Packages, Functions, and Materialized Views), MS SQL Server 2008/2005, DB2 UDB, MS Access 2000, Sybase

Programming GUI

Java, Toad, SQL Navigator, Harvest, Autosys, HTML 4.0, DHTML, XML, SQL, PL/SQL, Rational Clear Quest, Clear Case, Unix Shell Scripting

Environment

Sun Solaris 2.6/2.7/2.8/8.0, HP-UX 10.20/9.0, IBM AIX 4.2/4.3, MS-DOS 6.22, Novell NetWare 4.11/3.61, Windows 3.x/95/98, Windows NT, Windows 2000/XP/2003, Red Hat Linux, Sun Ultra, Sun SPARC, Sun Classic, SCO Unix, HP9000, RS6000

Professional Experience:

Confidential

Sr. ETL/Informatica Developer

Responsibilities:

  • Identified the exact business requirements by interacting with business analysts and other management through JAD sessions.
  • Followed agile methodology during the development process of the data designing.
  • Designed the conceptual models for the flow of data between different systems.
  • Added enhancements to the data model by following Star schema using Ralph Kimball methodology.
  • Applied data profiling techniques to analyze the content, quality and structure of data from different sources to make initial assessments.
  • Provided data cleansing techniques that can be used for multiple systems, including billing, customer service centers, and e-channels.
  • Applied source to target mapping and generated mapping matrix for transformation.
  • Provided sophisticated data management capabilities to ensure consistency and integrity of data for demanding legislation such as SOX compliance.
  • Performed customer profiling using a number of different classifications, which helped the organization target relevant customers with product and service offers, retain existing customers, and add new ones.
  • Managed the metadata that controlled the flow of data to different systems, which helped the organization control fraud; delivered a subscription-fraud detection solution by building customer behavior profiles.
  • Debugged mappings in Informatica using the Debugger wizard to observe the flow of data with different test cases for different types of data.
  • Implemented pipeline partitioning (hash key, key range, round robin, and pass-through) to improve session performance.
  • Extensively used PowerCenter capabilities such as file lists, pmcmd, target load order, constraint-based loading, and concurrent lookup caches.
  • Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.
  • Created scripts in UNIX for migration of data between the sources and the target data bases.
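Scheduling workflows from UNIX typically means wrapping pmcmd; a minimal sketch of such a wrapper is below. The domain, service, folder, and workflow names are hypothetical placeholders, not values from any actual project.

```python
# Sketch of a UNIX-side wrapper that starts a PowerCenter workflow via pmcmd.
# All names (domain, service, folder, workflow) are hypothetical placeholders.
import subprocess

def build_pmcmd_start(domain, service, user, folder, workflow):
    """Assemble a 'pmcmd startworkflow' command line that waits for completion."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,       # Integration Service name
        "-d", domain,         # Informatica domain
        "-u", user,           # password normally supplied via environment
        "-f", folder,         # repository folder
        "-wait", workflow,    # block until the workflow finishes
    ]

def run_workflow(cmd):
    # pmcmd exits 0 on success; a non-zero return code signals a failed run
    return subprocess.call(cmd)

cmd = build_pmcmd_start("Domain_ETL", "IS_ETL", "etl_user", "DW_LOADS", "wf_stg_to_dw")
```

In practice the shell script would check the return code and alert the scheduler on failure.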

Environment: Oracle 11g, SQL Server, Toad, ER Studio, Informatica 9.0.1, UNIX Shell Scripting, Teradata, Aqua Data Studio.

Confidential

Sr. ETL/Informatica Developer

Responsibilities:

  • Developed mappings/Reusable Objects/Transformation/mapplets by using mapping designer, transformation developer and mapplet designer in Informatica Power Center 9.1.
  • Designed and developed mappings using Source Qualifier, Expression, connected and unconnected Lookup, Router, Aggregator, Filter, Sequence Generator, Stored Procedure, Update Strategy, Joiner, and Rank transformations.
  • Experience working on KPIs (Key Performance Indicators) in software development projects on the CPP platform.
  • Worked in different claim amounts which generate the Revenue for NYSOMH, which include a wide range of Revenue generating categories from different parts of state.
  • The outcome of the measurement is used to initiate further process adjustments and improvements. It also enables benchmarking between development projects and, based on the collected data, an easier search for best practices that can be broadly implemented.
  • Coordinated in setting up the development, test, production, and contingency environments.
  • Coordinated with the Informatica administration team during deployments.
  • Worked closely with users, developers, and administrators to resolve production problems by reviewing design changes.
  • Worked on DataFlux along with web services that integrate quickly into business processes, applications, and websites, providing virtually seamless functionality that improves productivity and efficiency while reducing maintenance and management costs.
  • Knowledge of FTP and HIPAA compliant ANSI formatted EDI transactions, and thorough knowledge of security in handling PHI secure Health data transactions.
  • Implemented Type 2 Slowly Changing Dimensions to maintain the historical data in Data mart as per the business requirements.
  • Proficient in using Informatica Data Explorer (IDE) 9, Informatica Data Quality (IDQ) 9, and Data Analyst.
  • Involved in Data migration to a new server by decommissioning the existing production Servers.
  • Involved in working with MDM project across the global enterprise through increased accuracy, reliability, and timeliness of business-critical data.
  • Worked with Informatica Power Exchange 8.6.1, which works in conjunction with PowerCenter 9.1 to capture changes to data in source tables and replicate those changes to target tables and files.
  • Worked with Power Exchange for CDC techniques for relational database sources on UNIX, and Windows operating systems.
  • Created an SAS document for the configuration team to execute the workflows in order, moving data from the Dev box to QAT environments for testing purposes.
  • Worked closely with BI/QA teams troubleshooting defects and escalating design defects to architects and higher levels; also worked Data Quality tickets to improve the performance of slow-running queries.
  • Worked with existing OWB and ODI data ETL and Data Integrator tools to better accommodate the data with the existing Databases into the new ones.
  • Regularly participated in Scrum meetings for one project and followed the Agile SDLC methodology for another.
  • Used Session parameters, Mapping variable/parameters and created Parameter files for imparting flexible runs of workflows based on changing variable values.
  • Worked with Informatica Administrator to move project folders in development, test and production environments.
  • Involved in creation of various Unix Scripts and Perl Scripts which help the ETL scheduling jobs and help in initial validations of various tasks.
  • Wrote UNIX shell scripts for Informatica pre-session and post-session tasks, and Autosys scripts for scheduling the job workflows.
  • Involved in gathering and converting complex PL/SQL calculations into Informatica mappings.
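The Type 2 Slowly Changing Dimension logic mentioned above can be sketched in a few lines. The column names (sk, cust_id, address, is_current) are illustrative, not taken from the actual data mart.

```python
# Minimal Type 2 SCD sketch: expire the current dimension row and insert a
# new version whenever a tracked attribute changes. Columns are illustrative.
from datetime import date

def apply_scd2(dim_rows, incoming, today=None):
    today = today or date.today()
    current = {r["cust_id"]: r for r in dim_rows if r["is_current"]}
    next_sk = max((r["sk"] for r in dim_rows), default=0) + 1  # surrogate key
    for rec in incoming:
        old = current.get(rec["cust_id"])
        if old and old["address"] == rec["address"]:
            continue  # unchanged: keep the existing current row
        if old:
            old["is_current"] = False   # expire the old version
            old["end_date"] = today
        dim_rows.append({"sk": next_sk, "cust_id": rec["cust_id"],
                         "address": rec["address"], "start_date": today,
                         "end_date": None, "is_current": True})
        next_sk += 1
    return dim_rows
```

In PowerCenter the same flow is usually a Lookup against the dimension feeding an Update Strategy; the sketch just shows the row-versioning rule itself.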

Environment: Informatica 9.0.1, ILM, Oracle 11g, flat files, DB2, Salesforce, Teradata, web services, SQL Server, Toad 9x, SQL, PL/SQL, Windows XP/2003, UNIX Shell Scripting.

Confidential

ETL/Informatica Developer

Responsibilities:

  • Extracted required mapping fields from different source schemas (CDW to warehouse) for the staging tables.
  • Performed constraint-based loading and target load order planning, and also worked on pipeline partitioning.
  • Performed all conversion goals and objectives, including identifying data for processing sales and validating that historical sales are available in CDW for migrating the data to Salesforce.
  • Analyzed and performed data mapping, which involved identifying source data fields, identifying target entities and their lookup table IDs, and defining translation rules. Evaluated data groups and data criticality and developed an automated conversion process.
  • Loaded the ISM file, converted it to a flat file, loaded the flat file into the staging area, and ran masking.
  • Performed QA testing for all the stored procedures to validate the sales results.
  • Increased query performance by 20%, which is necessary for statistical reporting, after monitoring, tuning, and optimizing indexes using Performance Monitor and SQL Profiler. Reduced and eliminated unnecessary joins and indexes.
  • Created packages by testing and cleaning standardized data using Data Flow transformations (Data Conversion, Export Column, Merge Join, Sort, Union All, Conditional Split, and more) for existing and new packages, large CSV file imports to Salesforce from different data sources, and other ongoing mapping tasks.
  • Developed and modified existing reports, from basic chart and tabular reports to parameterized reports with single-, multi-, and Boolean-valued parameters and linked reports based on new business logic.
  • Rendered reports in different formats (PDF, Excel, etc.) to be executed automatically on a weekly or monthly basis, and managed subscriptions.
  • Documented all database objects, procedures, views, functions, and packages for future reference; performed unit testing and matched results across different environments.
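Constraint-based loading boils down to loading parent tables before the children that reference them. A minimal sketch of deriving such a target load order from foreign-key dependencies (table names are hypothetical):

```python
# Topological sort of target tables so that foreign-key parents load first,
# the same ordering constraint-based loading enforces. Names are illustrative.
from collections import deque

def load_order(tables, fk_deps):
    """fk_deps maps child table -> set of parent tables it references."""
    indegree = {t: 0 for t in tables}
    children = {t: [] for t in tables}
    for child, parents in fk_deps.items():
        for p in parents:
            indegree[child] += 1
            children[p].append(child)
    q = deque(t for t in tables if indegree[t] == 0)  # tables with no parents
    order = []
    while q:
        t = q.popleft()
        order.append(t)
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:   # all of c's parents are now scheduled
                q.append(c)
    return order
```

A fact table referencing two dimensions, for example, always sorts after both dimensions.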

Environment: Informatica 9.0.1, ILM, Oracle 11g, flat files, DB2, Salesforce, Teradata, web services, SQL Server, Toad 9x, SQL, PL/SQL, Windows XP/2003, UNIX Shell Scripting.

Confidential

Data Analyst/Sr. Informatica/Data Flux Developer

Data Analyst Responsibilities:

  • Analyzed different SAP ECC modules (SAP FI/CO, SD, PP, PM, HR, and MM) in SAP R/3 and understood the relationships by analyzing the source systems.
  • Worked on specifications given by the Data governance team and Data quality team that required managing the master data from all the business units and ensuring data quality across the enterprise.
  • Worked on SAP modules and SAP BW to analyze and understand the data flow from SAP to EIW, the data warehouse, and data marts.
  • Drove and assisted with data analysis to improve operations, including uncovering data anomalies and researching other key operational data, to enable efficiencies and overall data quality.
  • Conducted workflow, process diagram and GAP analysis to derive requirements for existing systems enhancements.
  • Performed data cleansing/scrubbing to remove incorrect, incomplete, improperly formatted, or duplicated data in the database, ensuring consistency in the target system.
  • Designed and built physical models for SAP MDM and also worked on SAP MDM Data Manager.
  • Applied Master Data Management to create and maintain consistent, complete, contextual, and accurate business data for all stakeholders.

Data Integration / ETL Responsibilities:

  • Designed and developed the semantic layer for reporting purposes so that end-user reporting would not be affected.
  • Determined the project scope from business requirements by conducting a number of sessions during the envisioning phase.
  • Worked closely with SMEs and the offshore team to analyze the areas that would be affected and need close attention.
  • Captured the changes in the data fields and analyzed the impacts on different systems with the introduction of Oracle's off-the-shelf product.
  • Created analytical views for key business requirements such as revenue generation and auto warranty renewals.
  • Derived the data mappings that could load all the information from sources to the targets.
  • Designed Informatica source-to-target mappings from the mainframe system to perform the initial load into the production server.
  • Efficiently utilized transformations including Expression, Router, Joiner, connected and unconnected Lookups, Filter, Aggregator, Rank, Sorter, and Sequence Generator in the mappings.
  • Developed UNIX shell scripts for data import/export, data conversion, and data cleansing.
  • Provided complex workflows that invoke the designed mappings in Informatica PowerCenter.
  • Implemented and worked extensively on Slowly Changing Dimensions (Type 1, Type 2, and Type 3) to access transaction history of accounts and transaction information using Change Data Capture (CDC) per business requirements.
  • Resolved quality issues using the data profiling and data mapping functionality of Informatica Data Explorer.
  • Tuned Mappings, Sessions, and SQL for better performance by eliminating various performance bottlenecks.
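The profiling step described above usually starts with per-column statistics and pattern frequencies. A minimal column profiler is sketched below; the A/9 pattern convention (letters become A, digits become 9) is a common profiling idiom, and the sample values in the test are made up.

```python
# Minimal column profiler: null count, distinct count, and pattern
# frequencies using the common A/9 masking convention.
import re

def profile_column(values):
    nulls = sum(1 for v in values if v in (None, ""))
    patterns = {}
    for v in values:
        if v in (None, ""):
            continue
        pat = re.sub(r"[A-Za-z]", "A", str(v))   # letters -> A
        pat = re.sub(r"[0-9]", "9", pat)         # digits  -> 9
        patterns[pat] = patterns.get(pat, 0) + 1
    return {"nulls": nulls,
            "distinct": len({v for v in values if v not in (None, "")}),
            "patterns": patterns}
```

An unexpected pattern (say, a letter inside a zip-code column) stands out immediately in the frequency table and points at a cleansing rule to write.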

Data Flux Responsibilities:

  • Worked with dfPower Studio and Data Integration Studio features to inspect data for errors, inconsistencies, redundancies, and incomplete information, and then correct, parse, standardize, match, and cluster the data.
  • Created job profiles for Frequency Distribution, Pattern analysis, primary/foreign key analysis, address verification, redundant data analysis of the available data in database.
  • Worked on collecting, storing, profiling, and managing the various data quality, integration logic, and business rules that are generated and created by using dfPower Studio 8.2.
  • Used DataFlux Architect to define sequences of operations, such as selecting data from a database, parsing the data, verifying address data, and then outputting the data into a new table.
  • Using the DataFlux Navigator, customized the core knowledge libraries (Quality Knowledge Base) that drive the dfPower Studio engine, allowing situational data issues to be addressed.
  • Worked extensively on creating jobs in Data Integration Studio that use a Sort transformation to sort the data in a source table and write it to a target table.
  • Used macro variables for the paths of file input/output nodes (Text File Input, Text File Output), the paths of all embedded job nodes, the data source, the SQL query, and the data targets (Insert, Update), which makes it easy to move the code from one environment to another.
  • Worked on maximizing the performance of Architect jobs by following DataFlux standards.
  • Used PVCS as a source control tool to check in items such as QKB files, Architect job files, Profile job files, Server configuration files and Management resources directory.
  • Experienced in using DIS to deploy production batch jobs.
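The standardize/match/cluster flow above can be sketched as follows. The abbreviation table is a tiny illustrative subset of what a real Quality Knowledge Base would hold, and matching here is exact on the standardized form.

```python
# Sketch of standardize-then-cluster matching: records whose standardized
# form agrees are grouped as duplicates. The abbreviation map is illustrative.
def standardize(name):
    """Uppercase, trim punctuation, and expand a few common abbreviations."""
    subs = {"ST": "STREET", "AVE": "AVENUE", "RD": "ROAD"}
    tokens = name.upper().split()
    return " ".join(subs.get(t.strip("."), t.strip(".")) for t in tokens)

def cluster(records):
    """Group records whose standardized form matches exactly."""
    clusters = {}
    for rec in records:
        clusters.setdefault(standardize(rec), []).append(rec)
    return clusters
```

Production match codes are fuzzier (phonetic keys, token reordering), but the shape of the job is the same: normalize first, then group on the normalized key.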

Environment: Informatica Power Center 9.0.1, dfPower Studio 8.2, Oracle 10g, PL/SQL, SQL*Plus, flat files, web services, UNIX, Windows NT, BMC Remedy, Erwin 7.x, Toad 10.

Confidential

ETL Informatica Developer

Responsibilities:

  • Designed and developed the semantic layer for reporting purposes so that end-user reporting would not be affected.
  • Determined the project scope from business requirements by conducting a number of sessions during the envisioning phase.
  • Worked closely with SMEs and the offshore team to analyze the areas that would be affected and need close attention.
  • Captured the changes in the data fields and analyzed the impacts on different systems with the introduction of Oracle's off-the-shelf product.
  • Created analytical views for key business requirements such as revenue generation and auto warranty renewals.
  • Coded Teradata macros for standard updates, view creation, and table drops.
  • Derived the data mappings that could load all the information from sources to the targets.
  • Designed Informatica source-to-target mappings from the mainframe system to perform the initial load into the production server.
  • Analyzed data and created detailed reports using proprietary tools and third-party technologies to identify and resolve data quality issues, ensure quality assurance, and drive data quality initiatives.
  • Devised strategies for extracting data from UNIX to staging and then from staging to the Teradata RDBMS.
  • Efficiently utilized transformations including Expression, Router, Joiner, connected and unconnected Lookups, Filter, Aggregator, Rank, Sorter, and Sequence Generator in the mappings.
  • Developed UNIX shell scripts for data import/export, data conversion, and data cleansing.
  • Provided complex workflows that invoke the designed mappings in Informatica PowerCenter.
  • Implemented and worked extensively on Slowly Changing Dimensions (Type 1, Type 2, and Type 3) to access transaction history of accounts and transaction information using Change Data Capture (CDC) per business requirements.
  • Resolved quality issues using the data profiling and data mapping functionality of Informatica Data Explorer.
  • Tuned Mappings, Sessions, and SQL for better performance by eliminating various performance bottlenecks.
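One way to move staged UNIX flat files into Teradata is to generate a loader script per table. This sketch renders a minimal FastLoad script; the tdpid, credentials, table name, and file path are placeholders, and a real script would also set the record format, checkpoints, and per-column types.

```python
# Sketch: render a minimal Teradata FastLoad script for a staging load.
# The tdpid, credentials, table, and file path are hypothetical placeholders.
def fastload_script(table, datafile, columns, tdpid="tdprod"):
    defs = ",\n  ".join(f"{c} (VARCHAR(100))" for c in columns)
    vals = ", ".join(f":{c}" for c in columns)
    return (f".LOGON {tdpid}/etl_user,password;\n"
            f"BEGIN LOADING {table} ERRORFILES {table}_e1, {table}_e2;\n"
            f"DEFINE\n  {defs}\n  FILE={datafile};\n"
            f"INSERT INTO {table} VALUES ({vals});\n"
            f"END LOADING;\n"
            f".LOGOFF;\n")
```

Generating the script from table metadata keeps the dozens of staging loads consistent and makes adding a column a one-line change.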

Environment: Informatica Power Center 9.0.1, Autosys, Oracle 10g, PL/SQL, SQL*Plus, flat files, web services, UNIX, Windows NT, BMC Remedy, Erwin 7.x, Toad 10.

Confidential

ETL Informatica Developer

Responsibilities:

  • Responsible for complete life cycle implementation that encompassed Business Requirement gathering, analyzing their source systems and then building a new data mart in order to provide functionality for their reporting purposes.
  • Understood and articulated business requirements from user interviews, then converted the requirements into technical specification documents.
  • Implemented a data model for the ABAC (Automated Balance and Audit Control) process.
  • Extensively used Erwin to design logical, physical, and domain models, visually represented in diagrams using Information Engineering (IE) notation.
  • Extensively implemented and governed the corporate naming standards as specified by the company.
  • Identified all the conformed dimensions to be included in the target warehouse design and confirmed the granularity of the facts within the fact tables.
  • Used accumulating snapshot grain to handle multiple date keys and late arriving fact rows.
  • Did data profiling in order to understand the data and how different entities relate to each other.
  • Interacted with SMEs to accumulate knowledge about business processes and documented it for future reference.
  • Was responsible for all design reviews of mappings according to the high-level design mapping documentation and the standards followed for the project; tuned Informatica sessions to increase the cache size and target-based commit interval.
  • Performed data migration in which relational data was extracted from different sources (flat files, Excel sheets, XML files), transformed, and loaded into the target Oracle database without any data loss.
  • Used transformations such as Joiner, Stored Procedure, Router, Aggregator, Source Qualifier, Lookup, and Expression.
  • Performed data profiling, data cleansing, and data scrubbing operations during transformation using the Expression transformation.
  • Created mapplets, workflows and sessions using source, target and transformations to run the mappings in the desired order.
  • Performed database tuning to improve performance by creating and modifying tablespaces.
  • Performed Unit testing, User Acceptance Test and also documented test cases for UAT.
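The accumulating snapshot grain mentioned above revisits one fact row as each milestone date becomes known, including late-arriving ones. A sketch with illustrative milestone columns and simple day-number date keys:

```python
# Sketch of an accumulating snapshot update: one fact row per order,
# with a date-key slot per milestone. Column names are illustrative.
def update_snapshot(fact_row, milestone, date_key):
    slot = milestone + "_date_key"
    if slot not in fact_row:
        raise KeyError(f"unknown milestone: {milestone}")
    fact_row[slot] = date_key          # late-arriving dates just fill their slot
    # derive a lag measure once both endpoints are known
    if fact_row.get("order_date_key") and fact_row.get("delivery_date_key"):
        fact_row["days_to_deliver"] = (fact_row["delivery_date_key"]
                                       - fact_row["order_date_key"])
    return fact_row
```

Unlike a transaction fact, the same row is updated repeatedly, which is why the design calls out multiple date keys and late-arriving fact handling.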

Environment: Informatica Power Center 8.1.1, SQL Server 2008/2005, Business Objects, UNIX shell Scripting, Erwin, Rational Architect, Toad.

Confidential

ETL Informatica Developer

Responsibilities:

  • Identified the exact business requirements by interacting with business analysts and other management through JAD sessions.
  • Followed agile methodology during the development process of the data designing.
  • Designed the conceptual models for the flow of data between different systems.
  • Added enhancements to the data model by following Star schema using Ralph Kimball methodology.
  • Applied data profiling techniques to analyze the content, quality and structure of data from different sources to make initial assessments.
  • Provided data cleansing techniques that can be used for multiple systems, including billing, customer service centers, and e-channels.
  • Applied source to target mapping and generated mapping matrix for transformation.
  • Provided sophisticated data management capabilities to ensure consistency and integrity of data for demanding legislation such as SOX compliance.
  • Performed customer profiling using a number of different classifications, which helped the organization target relevant customers with product and service offers, retain existing customers, and add new ones.
  • Managed the metadata that controlled the flow of data to different systems, which helped the organization control fraud; delivered a subscription-fraud detection solution by building customer behavior profiles.
  • Debugged mappings in Informatica using the Debugger wizard to observe the flow of data with different test cases for different types of data.
  • Implemented pipeline partitioning (hash key, key range, round robin, and pass-through) to improve session performance.
  • Extensively used PowerCenter capabilities such as file lists, pmcmd, target load order, constraint-based loading, and concurrent lookup caches.
  • Configured the sessions using workflow manager to have multiple partitions on Source data and to improve performance.
  • Created scripts in UNIX for migration of data between the sources and the target data bases.

Environment: Oracle 10g, SQL Server 2003, Toad, ER Studio, Informatica 7.1.
