Sr. Informatica Developer Resume
New York, NY
PROFESSIONAL SUMMARY:
- Over 9 years of IT experience in Analysis, Design, Development, Implementation, Testing and Troubleshooting of Data Warehousing solutions using the Informatica toolset and SQL.
- Extensive ETL (Extract Transform Load) experience in Data Integration, Data Warehousing, Data Migration, Data Management and Data Cleansing using Informatica and SSIS.
- Expertise in Data Warehousing concepts such as the Ralph Kimball and Bill Inmon methodologies, Star Schema, Snowflake Schema, Fact and Dimension tables, OLAP/OLTP, logical and physical data modeling, dimensional and multidimensional modeling, data profiling and data cleansing.
- Extensive knowledge in Business Intelligence, Data Warehousing Concepts and Software Development Life Cycle (SDLC).
- Solid experience designing and developing complex mappings to extract data from diverse legacy sources including CSV files, flat files, fixed-width files, delimited files, XML files, web services, Teradata, Oracle, MS SQL Server, DB2, Netezza, Sybase, Salesforce, SAP and FTP into a common reporting and analytical data model using Informatica PowerCenter and SSIS.
- Good experience with the Facets data model and Facets table structure; understanding of the Claim/Provider/Membership data marts.
- Solid Experience in Designing, Configuring, Customizing, Developing and Administrating ETL solutions using Informatica Toolset.
- Proficient in designing and developing ETL objects in Informatica PowerCenter using transformations such as Joiner, Aggregator, Expression, SQL, Lookup, Filter, Update Strategy, Stored Procedure, Sorter, Sequence Generator, Router, Rank, Normalizer, B2B transformations, etc.
- Good experience working with the Informatica Data Quality (IDQ) toolkit: analysis, data cleansing, data matching, data conversion, address standardization, exception handling, and the reporting and monitoring capabilities of IDQ.
- Experience working with Informatica Data Quality Developer/Analyst tools to remove noise from data using transformations such as Standardization, Merge and Match, Case Conversion, Consolidation, Parser, Labeler, Address Validation, Key Generator, Lookup and Decision.
- Widespread knowledge and experience on Data Analysis and Data Profiling using SQL.
- Experience working with data anonymization, healthcare claims, healthcare enrollments, electronic health records, pharmacy data and other data sets involving HIPAA and high compliance standards.
- Implemented slowly changing dimension (Type 1, Type 2 and Type 3) methodologies for accessing the full history of account and transaction information.
- Designed and developed change data capture (CDC) solutions that capture and analyze changes from daily feeds to maintain history tables.
- Extensively worked with Teradata Viewpoint for performance monitoring and performance tuning.
- Solid experience programming with SQL, PL/SQL (stored procedures, functions, cursors and database triggers), BTEQ and T-SQL.
- Excellent knowledge and experience in creating and performance-tuning high-volume databases, tables, stored procedures, DDL/DML triggers, views, user-defined data types, cursors and indexes.
- Strong Teradata SQL experience developing ETL with complex tuned queries, including analytical functions and BTEQ scripts.
- Experienced in identifying performance bottlenecks and fixing code for optimization in Informatica and Oracle.
- Involved in requirements gathering through user meetings, understanding the business from both end-user and technical perspectives.
- Experience with industry-standard methodologies such as Waterfall, Agile and Scrum within the Software Development Life Cycle (SDLC).
- Results oriented, self-starter looking for challenges, ability to rapidly learn and apply new technologies and good interpersonal skills.
- Excellent written and verbal communication skills and experience in interacting with business owners both formally and informally.
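The Type 2 slowly-changing-dimension work summarized above can be sketched in a few lines. This is an illustrative Python sketch only (the `acct`/`city` columns and date values are hypothetical), not the production Informatica mappings:

```python
from datetime import date

def apply_scd2(dimension, incoming, business_key, load_date):
    """Apply a Type 2 SCD update: expire the current row for a
    changed key and insert a new version to preserve history."""
    for new_row in incoming:
        current = next(
            (r for r in dimension
             if r[business_key] == new_row[business_key] and r["is_current"]),
            None,
        )
        if current is None:
            # Brand-new key: insert the first version.
            dimension.append({**new_row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
        elif any(current[c] != new_row[c] for c in new_row):
            # Attribute changed: close the old version, open a new one.
            current["end_date"] = load_date
            current["is_current"] = False
            dimension.append({**new_row, "eff_date": load_date,
                              "end_date": None, "is_current": True})
    return dimension
```

A Type 1 variant would simply overwrite the current row in place; Type 3 would keep the prior value in a dedicated "previous" column instead of adding a row.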
TECHNICAL SKILLS:
ETL Tools: Informatica 10.1.1/10.1.0/10/9.6.1/9.6/9.1/9/8/7, Informatica Cloud, Informatica B2B, Informatica PowerExchange, Informatica Big Data Edition 9.6, Informatica IDQ, MS SSIS 2012/2008
Relational Databases: Teradata 14.10/14/13.10/13, Oracle 11g/10g/9i, MS SQL Server 2014/2012/2008, Netezza, DB2, MySQL, MS Access
Data Modeling: Star Schema & Snowflake Schema; Fact and Dimension Tables
Scripting Languages: BTEQ, PL/SQL, T-SQL, Shell, Perl
Scheduling Tools: Tidal, Autosys, Control-M
DB Query Tools: Toad, SQL Developer, SQL Assistant, SQL Plus
Version Control Tools: Visual SourceSafe, SVN, TFS, WinCVS
Business Management Tools: MS Office, MS Excel, MS Visio
Environments: UNIX (Sun Solaris, HP-UX, AIX), Linux, Windows 7/2003/XP/2000/NT/98/95
Others: FileZilla, MKS Toolkit, PuTTY, WinSCP, Facets DB
PROFESSIONAL EXPERIENCE:
Confidential - New York, NY
Sr. Informatica Developer
Responsibilities:
- Involved in all phases of the PDLC, from requirements and design through development, testing, administration, training, rollout to field users, and production support.
- Analyzed requirements and produced functional specifications through discussions with business user groups; translated business requirements and documented the source-to-target mappings and ETL specifications.
- Involved in end-to-end design and development, including high-level design, code development and deployment.
- Used Informatica B2B Data Exchange to handle EDI (Electronic Data Interchange) for payments on their scheduled dates.
- Worked with the Facets DB and table structures to retrieve Claim, Membership and Provider data from CMC tables.
- Worked with Informatica Cloud to create source and target objects and developed source-to-target mappings.
- Profiled the source data and performed data quality checks using Informatica Data Quality, loading the cleansed data into landing tables.
- Developed new mappings and enhanced existing mappings for new business requirements, loading data into staging tables and then into target tables in the EDW; also created mapplets for reuse across mappings.
- Extensively worked on performance tuning to increase data-load throughput (e.g., reading from flat-file sources and writing to flat-file targets to isolate bottlenecks).
- Worked on Error handling and performance tuning in Teradata queries and utilities.
- Wrote SQL scripts to extract data from the database and for testing purposes.
- Interacted with the source team and the business to validate the data.
- Involved in resolving issue tickets related to data issues, ETL issues, performance issues, etc.
- Used UNIX shell scripting to automate several ETL processes.
- Provided ongoing production support by monitoring errors in daily loads and played an active role in resolving them.
- Developed unit test cases and did unit testing for all the developed Informatica mappings.
- Maintained and migrated mappings across Development, Test and Production environments using Repository Manager, which was also used to maintain metadata, security and reporting.
- Ensured compliance with development methodology and technical process.
- Conducted business requirement review sessions with IT and business teams to understand the expected outcomes based on data elements.
- Wrote scripts for data cleansing, data validation and data transformation for data coming from different source systems.
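The cleansing and validation scripts described above can be illustrated with a minimal Python sketch. The column names (`member_id`, `state`) and rules are hypothetical stand-ins, not the actual project logic:

```python
import re

# Hypothetical required columns for a membership feed.
REQUIRED = ("member_id", "state")

def cleanse(row):
    """Trim whitespace, collapse internal runs of spaces, and
    upper-case the state code."""
    out = {k: re.sub(r"\s+", " ", v.strip()) if isinstance(v, str) else v
           for k, v in row.items()}
    if out.get("state"):
        out["state"] = out["state"].upper()
    return out

def validate(row):
    """Return a list of rule violations for a cleansed row;
    an empty list means the row may proceed to loading."""
    return [f"missing {col}" for col in REQUIRED if not row.get(col)]
```

In practice rows failing `validate` would be routed to an error/reject table rather than the target.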
Environment: Informatica Power Center 10.1.1/10.1.0/10/9.6.1, Informatica IDQ, MS SQL Server, Facets DB, Informatica B2B Data Exchange, SQL Assistant, Git, Unix, Autosys r11.1, SAS, Tableau 9/10, Flat Files.
Confidential - Louisville, KY
Sr. ETL Developer
Responsibilities:
- Actively involved in understanding business requirements, analysis and design of the ETL process.
- Used Informatica Power Center to extract, transform and load data from various source systems to staging and target Data warehouse.
- Used Power Center Workflow Manager for session management, database connection management and scheduling of jobs to be done in batch process.
- Created fact and dimension tables based on specifications provided by managers using Star Schema.
- Designed and created complex source to target mappings using various transformations inclusive of but not limited to Aggregator, Look Up, Joiner, Source Qualifier, Expression, Sequence Generator, and Router Transformations.
- Tuned the performance of mappings following Informatica best practices, applying several methods to reduce workflow run times.
- Used Type 1 and Type 2 SCD mappings to update Slowly Changing Dimension tables.
- Modified several of the existing mappings based on the user requirements and maintained existing mappings, sessions and workflows.
- Worked with different workflow tasks, including Session, Event Raise, Event Wait, Decision and Email tasks.
- Extensively worked on Mapping Variables, Mapping Parameters and Session Parameters.
- Created Data Breakpoints and Error Breakpoints for debugging the mappings using Debugger Wizard.
- Used Change Data Capture (CDC) to simplify ETL in data warehouse applications.
- Wrote UNIX scripts to transfer data from operational data sources to the target warehouse.
- Developed PL/SQL procedures, functions, triggers for processing business logic in the database. Wrote optimized SQL queries for data retrieval from the source database.
- Worked with OBIEE team on Fact and Dimension tables for reporting Purposes.
- Created MUD (Multiple user Development) environment by utilizing projects in OBIEE.
- Gathered the requirement, completed the proof of concept, Designed, Developed and Tested Physical Layer, Business Layer and Presentation Layer of OBIEE.
- Performed the setting up, Installation and configuration of the complete Analytics platform environment and the required connectivity for seamless integration with the data warehouse.
- Performed migration and merging of RPDs in OBIEE.
- Used pmcmd to interact with the Informatica server from the command line and to execute shell scripts.
- Monitored session logs to check the progress of data loads.
- Involved in different Team review meetings.
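The pmcmd-based automation mentioned above typically wraps the `pmcmd startworkflow` command line. A minimal sketch of building that command in Python follows; the service, domain, user, folder and workflow names are placeholders, and in practice the password would come from an environment variable rather than the command line:

```python
def pmcmd_start_workflow(service, domain, user, folder, workflow):
    """Build a pmcmd startworkflow command line as an argument list
    (connection values here are illustrative placeholders)."""
    return [
        "pmcmd", "startworkflow",
        "-sv", service,   # Integration Service name
        "-d", domain,     # Informatica domain name
        "-u", user,       # run-as user
        "-f", folder,     # repository folder containing the workflow
        "-wait",          # block until the workflow completes
        workflow,
    ]
```

A wrapper shell script would run this command, check the exit code, and alert or retry on failure.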
Environment: Informatica Power Center 9.1, Oracle 11g, PL/SQL, UNIX shell scripts, OBIEE Analytics 7.8.4, TOAD, windows XP, Linux, Erwin 4.X.
Confidential - Boise, ID
ETL Developer
Responsibilities:
- Involved in All the phases of the development like Analysis, Design, Coding, Unit Testing, System Testing and UAT.
- Involved in Data Migration from MS SQL server to Teradata.
- Moved data from source systems into different schemas based on the dimension and fact tables, using Slowly Changing Dimensions Type 2 and Type 1.
- Mostly worked on Dimensional Data Modelling, Star Schema and Snowflake schema modelling.
- Worked on the various enhancements activities, involved in process improvement.
- Worked on Change Data Capture (CDC) using CHKSUM to detect changed rows when no flag or date column was present to mark them.
- Worked on reusable "tie-out" code to maintain data consistency: it compares the source and target after ETL loading completes to validate that no data was lost during the ETL process.
- Worked on the PowerExchange bulk data movement process using the PowerExchange Change Data Capture (CDC) method, PowerExchange Navigator and PowerExchange bulk data movement; PowerExchange CDC can retrieve updates at user-defined intervals or in near real time.
- Worked on Informatica B2B Data Exchange to parse HL7 files.
- Created and modified several UNIX shell Scripts according to the changing needs of the project and client requirements.
- Involved in the continuous enhancements and fixing of production problems.
- Created PL/SQL stored procedures, functions and packages for moving the data from staging area to data mart.
- Created and reviewed scripts to create new tables, views, queries for new enhancement in the application using TOAD.
- Extensively involved in Unit, Integration and System testing and responsible for fixing the Program defects.
- Created indexes on the tables for faster retrieval of the data to enhance database performance.
- Supported the development and production support group in identifying and resolving production issues.
- Developed wrapper shell scripts for calling Informatica workflows using PMCMD command and Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Implemented Teradata MERGE statements to update huge tables, improving the performance of the application.
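The CHKSUM-style change detection described above boils down to hashing the tracked columns of each row and comparing against the stored hash. A hedged Python sketch (the `id`/`amt` columns are hypothetical, and MD5 stands in for the database's checksum function):

```python
import hashlib

def row_checksum(row, columns):
    """Concatenate the tracked columns and hash them, mimicking a
    CHKSUM-style change-detection column."""
    joined = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()

def changed_rows(source, target_sums, key, columns):
    """Return source rows that are new or whose checksum differs from
    the stored one -- useful when no update flag or date column exists."""
    return [r for r in source
            if target_sums.get(r[key]) != row_checksum(r, columns)]
```

Only the returned delta rows would then be pushed through the update path (e.g., a MERGE), rather than reprocessing the full table.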
Environment: Informatica 9.1(Designer, Repository Manager, Workflow Manager, Workflow Monitor), Windows, Oracle 11G, Teradata 14/13.10, Tableau, UNIX, Putty, PL/SQL Developer, Power Exchange.
Confidential - Long Beach, CA
ETL Developer
Responsibilities:
- Worked as an ETL developer, heavily involved in the design, development and debugging of ETL mappings using the Informatica Designer tool.
- Built and maintained SQL scripts, indexes and complex queries for data analysis and extraction.
- Extracted data from the Oracle database and loaded it into SET and MULTISET tables in the Teradata database using various Teradata load utilities; also transferred large volumes of data using Teradata FastLoad, MultiLoad and TPump.
- Developed Teradata BTEQ scripts to implement the business logic and worked on exporting data using Teradata FastExport.
- Performed unit, integration and system level performance testing. Associated with production support team in various performances related issues.
- Created mappings using Aggregator, Expression, Joiner, Filter, Sequence Generator, Stored Procedure, Connected & Unconnected Lookup and Update Strategy transformations in the Informatica PowerCenter Designer.
- Extensively used ETL to load data from different sources such as flat files, XML and Oracle into the Teradata database.
- Implemented advanced geographic mapping techniques, using custom images and geocoding to build spatial visualizations of non-geographic data.
- Worked with all levels of development from analysis through implementation and support.
- Developed reports that deliver data from cubes.
- Responsible for ongoing maintenance and change management to existing reports and optimize report performance.
- Provided 24/7 on-call production support for various applications, resolving night-time production job abends and attending conference calls with business operations and system managers to resolve issues.
- Worked on mapping parameters and variables for the calculations done in aggregator transformation.
- Tuned and monitored Informatica workflows using the Informatica Workflow Manager and Workflow Monitor tools.
- Designed and Implemented the ETL Process using Informatica power center.
- Developed mapping parameters and variables to support SQL override.
- Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
- Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
- Involved in Performance tuning at source, target, mappings, sessions, and system levels.
- Exhaustive testing of developed components.
- Worked on the various enhancements activities, involved in process improvement.
- Worked with Ab Initio to replicate the existing code into Informatica.
- Performed unit testing at various levels of the ETL and actively involved in team code reviews.
- Created shell scripts to fine tune the ETL flow of the Informatica workflows.
- Migrated the code into QA (Testing) and supported QA team and UAT (User).
- Worked with PowerCenter versioning (check-in, check-out), querying to retrieve specific objects and maintaining object history.
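The Pushdown Optimization work noted above amounts to letting the database evaluate transformation logic instead of the ETL engine. A minimal sketch of the idea using an in-memory SQLite database (the `claims` table and `status` filter are purely illustrative, not the actual mapping):

```python
import sqlite3

# In-memory stand-in for a source database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id INTEGER, status TEXT)")
conn.executemany("INSERT INTO claims VALUES (?, ?)",
                 [(1, "PAID"), (2, "DENIED"), (3, "PAID")])

# Without pushdown: pull every row, then apply the Filter in the ETL tool.
all_rows = conn.execute("SELECT claim_id, status FROM claims").fetchall()
filtered_in_tool = [r for r in all_rows if r[1] == "PAID"]

# With pushdown: the filter becomes part of the generated source SQL,
# so only qualifying rows ever leave the database.
pushed_down = conn.execute(
    "SELECT claim_id, status FROM claims WHERE status = 'PAID'").fetchall()

# Same result either way; pushdown just moves the work and cuts data transfer.
assert filtered_in_tool == pushed_down
```

The same trade-off applies to joins and aggregations: the more of the mapping the database can evaluate, the fewer rows cross the wire to the Integration Service.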
Environment: Informatica 7.1, MS SQL Server 2008, Windows, Linux, Oracle 9i, Control-M, SVN, FTP.