Sr. Informatica Developer Resume

Weehawken, NJ

SUMMARY

  • 14 years of IT experience in Data Warehousing and Business Intelligence, with emphasis on business requirements analysis, application design, development, testing, implementation and maintenance of client/server Data Warehouse and Data Mart systems in the Financial, Insurance, Pharmaceutical and Telecom industries.
  • Experienced with all phases of the SDLC (System Development Life Cycle), from requirements gathering through testing. Participated in the requirements gathering phase to clearly understand the purpose of the Data Warehouse solution, gathering information about business processes and business entities and producing diagrams and reports, including ER diagrams and process flow diagrams.
  • 14 years of development and design of ETL methodology supporting data transformations and processing in a corporate-wide ETL solution using Informatica Power Center 10.1.1/9.6.1/8.6.1/8.1.1/8.0/7.1/7.0/6.2/6.1/5.1.2/5.1.1/4.7 (Workflow Manager, Workflow Monitor, Source Analyzer, Data Warehousing Designer, Mapping Designer, Mapplet Designer, Transformation Developer), Informatica PowerMart 6.2/6.1/5.1.2/5.1.1/4.7, Power Connect, PowerPlug, PowerAnalyzer, Power Exchange, Datamart and Autosys.
  • 8 years of strong data cleansing experience using Informatica cleansing functions such as REPLACECHR, REPLACESTR and IS_DATE. Worked with data profiling.
  • Involved in the design of Dimensional Models, Star Schemas and Snowflake Schemas. Knowledge of the Ralph Kimball and Bill Inmon approaches to Data Warehousing.
  • Understanding of OLAP, OLTP, Business Intelligence and Data Warehousing concepts with emphasis on ETL and Business Reporting needs.
  • Developed test cases for business and user requirements to perform System/Integration/Performance testing.
  • Extensively used Mercury Quality Center to load test cases, execute them and log defects found in system testing / UAT.
  • Knowledge of Java and Web services.
  • Good understanding of Dimensional and Data Modeling using Star & snowflake schemas. Experience with Fact and Dimension Tables.
  • Enhanced the Logical and Physical Data Models of the Enterprise Data Warehouse to incorporate the new requirements, by creating new subject areas.
  • Established the Entities, Attributes and Relationships and created the Logical data model (LDM) and mapped the same to the Physical data model (PDM).
  • 6 years of database experience using Oracle 11g/10g/9i/8i/7.x, Sybase 12.5, MS Access 7.0/2000, SQL, PL/SQL, SQL*Plus, SQL*Loader and Developer 2000. Hands-on working knowledge of Oracle and PL/SQL, writing stored procedures, packages, functions and triggers. Adept at working with distributed databases and SQL*Loader.
  • 6 years of UNIX shell scripting experience using commands such as sed, awk, grep and wc. Worked with crontab to schedule jobs and ftp to transfer files to and from remote servers.
  • Kicked off Informatica workflows by running JCLs from the mainframe environment.
  • Experienced with PowerExchange and PowerConnect to connect to and import sources from external systems such as mainframe (DB2, VSAM) and MQ Series.
  • Used the pmcmd command in non-Windows environments to communicate with the Informatica server (a usage sketch follows this summary). Optimized/tuned mappings for better performance and efficiency; performed performance tuning of SQL queries, sources, targets and sessions.
  • Experienced in providing production support on a 24/7 basis and monitored the execution of Informatica Workflows.
  • Maintained Development, Test and Production mapping migration using Repository Manager; also used Repository Manager to maintain metadata, security and reporting. Tuned Informatica mappings for optimum performance.
  • Experienced in Managing the Metadata and release management. Participated in creating the ETL Metadata for new sources/targets and importing the base line code from production for enhancements.
  • Worked extensively with the ECMS/ARM process to move the code from individual folders to the Project folder and then deploy the approved code into UAT and Production environments.
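
The pmcmd usage referenced in the summary typically looks like the sketch below. This is a minimal illustration only: the integration service, domain, folder and workflow names are placeholders rather than actual project values, and credentials are assumed to come from environment variables.

#!/bin/ksh
# Minimal sketch of driving a PowerCenter workflow with pmcmd from UNIX.
# INT_SVC_DEV, DOM_DEV, FIN_DW and wf_load_fact_positions are placeholders.
pmcmd startworkflow -sv INT_SVC_DEV -d DOM_DEV \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f FIN_DW -wait wf_load_fact_positions

# Report the run details of the workflow after it completes
pmcmd getworkflowdetails -sv INT_SVC_DEV -d DOM_DEV \
    -u "$INFA_USER" -p "$INFA_PASS" \
    -f FIN_DW wf_load_fact_positions

Running startworkflow with -wait blocks until the workflow finishes, so a scheduler such as cron or Autosys can act on the script's exit code.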

TECHNICAL SKILLS

Data Warehousing: Informatica Power Center 10.1.1/9.6.1/8.6.1/8.5/8.1.1/7.1.2, Informatica Power Mart 6.1/5.1/5.0/4.7, Informatica Power Connect, Informatica Power Exchange 8.1.1, OWB and OLAP.

Dimensional Data Modeling: Dimensional Data Modeling, Data Modeling, Star Schema Modeling, Snowflake Modeling, Fact and Dimension Tables, Physical and Logical Data Modeling, Erwin 7.0/4.5/4.0/3.0, Oracle Designer, Visio

Business Intelligence: Cognos Series 7, ReportNet 1.1, Cognos 8 BI, Business Objects 6.0/5.1/5.0 (Web Intelligence 2.5, Designer 5.0, Developer Suite & Set Analyzer 2.0), Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), MS Access Reports

GUI: Visual Basic 5.0/6.0, Oracle Forms 6.0, Oracle Reports 6.0, VC++, FrontPage 97/98/2000, Visio and TOAD 6.0/5.0.

RDBMS: Oracle 11g/10g/9i/8i/7.0, Sybase 12.5, MS SQL Server 2005/2000/9.0/8.0/7.0, SQL Server DTS, DB2, MS Access 7.0/2000.

Tools: Developer 2000 (Forms 4.5/5.0, Reports 2.5/3.0), Erwin, MS Access Reports, SQL*Loader, SQL Navigator 4.4, SQL Developer, TOAD 11.5, Flash, Adobe, MQC, Autosys.

Languages: COBOL, C, C++, J2SE 1.2, FORTRAN

Environment: UNIX, MS-DOS, Sun Solaris 2.7, HP-UX, AIX 4.3.3, Windows 9X/2000/NT/XP, Sun Ultra, Sun SPARC, Sun Classic

PROFESSIONAL EXPERIENCE

Confidential, Weehawken, NJ

Sr. Informatica Developer

Responsibilities:

  • Served as the key resource in talking to end users, gathering requirements, converting them into ETL code, and loading the data into the final target tables so that the necessary reports are generated for the end users.
  • Extensively involved in creating the UNIX scripts and Autosys JIL jobs (PRECHK, FW, etc.) to start the Informatica jobs automatically.
  • Actively involved in creating the build using the Query tool and storing the artifacts in the Nexus repository, while working with Confidential Deploy to move the code to higher environments.
  • Documented the purpose of each workflow in detail so that it is helpful to future developers.
  • In the Developer tool, used the Informatica 10.1 dynamic mapping feature to reuse the same mapping logic at runtime for different sources and targets.
  • Worked with the Java transformation to parse the data into corresponding values.
  • Participated in performance enhancement in Informatica sessions.
  • Previously, a single lookup created a source dump for all 7 instruments, which was not advisable performance-wise because it occupied a large amount of primary memory; split this lookup into 7 lookups to create a separate source dump for each individual instrument.
  • Participated in developmental work as a part of enhancements and Defect fixing of the system.
  • Wrote complex SQL queries as SQL Override to minimize the data selection with some filters as per the requirements.
  • Extensively used the Autosys commands for starting, stopping, killing and verifying the status of the jobs throughout the entire project, as sketched below.
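
The day-to-day Autosys usage referenced above amounts to a handful of sendevent and autorep calls. This is a minimal sketch; the job name is an illustrative placeholder, not one from the project.

# Illustrative Autosys CLI usage; FIN_DW_LOAD_BOX is a placeholder job name.
sendevent -E FORCE_STARTJOB -J FIN_DW_LOAD_BOX   # start the job immediately
sendevent -E KILLJOB -J FIN_DW_LOAD_BOX          # kill a running job
autorep -J FIN_DW_LOAD_BOX                       # verify the current job status
autorep -J FIN_DW_LOAD_BOX -q                    # show the job's JIL definition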

Environment: Informatica Power Center 10.1.1, Oracle 11g, TOAD 12.6, PL/SQL, Cloudera Impala, Hue, UNIX, FTP, SFTP.

Confidential, Herndon, VA

Sr. Informatica Developer

Responsibilities:

  • Participated in requirement analysis with the help of business model and functional model.
  • Developed parameter driven ETL process to map source systems (XML Sources) to target data warehouse with Informatica using various transformations like Expression, Router, Update strategy, Connected and Unconnected lookups, Aggregator, Sorter, Sequence generator, etc. and having files as a source and targets along with database objects.
  • Extensively worked with XML sources which are from external source system and pulled the data from multiple groups (views) and joined them as per the requirements.
  • Extensively worked with exception data and routed it to a separate exception data store for further analysis.
  • Conducted ETL development in the Netezza environment using standard design methodologies.
  • Wrote SQL and PL/SQL code, stored procedures and packages: to drop and re-create indexes, generate Oracle sequences, automatically clean up the base tables if a mapping had already run, logically expire existing records and update specific codes, and incorporate business rules into packages to eliminate transformation dependencies where possible (a shell-wrapper sketch follows this list).
  • Developed test cases and performed unit testing after conducting peer reviews of all the workflows before moving them to higher environments.
  • Created High Level / Low level Design Documents, Application Control Documents (Supplemental documents) and Interface Control Documents and uploaded them in DOORS.
  • Extensively used the Autosys commands for starting, stopping, killing and verifying the status of the jobs throughout the entire project.
  • Experienced working with Rational ClearCase for version control and migrating the artifacts to higher environments.
  • Created Run Books and Release Notes for the migration of the code and performed the migration through the ClearCase tool.
  • Tracked the defects using the ClearQuest tool and generated defect summary reports.
  • Attended the daily Defect calls and resolved the defects after discussing with the team members.
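
As a sketch of the index handling mentioned in the SQL/PL-SQL bullet above, the drop and re-create steps can be driven from a shell wrapper around SQL*Plus such as the one below. The connection variables, table and index names are illustrative assumptions, not actual project objects.

#!/bin/ksh
# Sketch: drop an index before a bulk load and rebuild it afterwards.
# $DB_USER/$DB_PASS/$DB_TNS, stg_claims and stg_claims_ix1 are placeholders.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
DROP INDEX stg_claims_ix1;
EXIT
EOF

# ... the Informatica session loads STG_CLAIMS here ...

sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
WHENEVER SQLERROR EXIT FAILURE
CREATE INDEX stg_claims_ix1 ON stg_claims (claim_id);
EXIT
EOF

In practice a drop like this would typically run as a pre-session command and the rebuild as a post-session command around the bulk load.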

Environment: Informatica Power Center 9.5.1/9.6.1, Oracle 11g, Netezza 6.x, TOAD 11.5, PL/SQL, UNIX, Rational ClearCase, ClearQuest, MDM, FTP, SFTP.

Confidential, New York, NY

Sr. Informatica Developer

Responsibilities:

  • Interacted with business users from different business and operations teams for requirement gathering.
  • Prepared technical specifications from business requirements to develop and implement ETL processes using Informatica.
  • Loaded the data into stage and fact tables to generate and validate the reports sent out for analysis.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Developed shell scripts for merging files, deleting temporary files, sending the files to a specified location, etc. (see the sketch after this list).
  • Created interfaces with different types of mappings using various transformations like Expression, Filter, Router, Aggregator, Lookup, Joiner, Stored Procedure, Update Strategy, Normalizer, etc.
  • Extensively worked with Mapplets to design a common logic that can be used across several modules.
  • Wrote the SQL override queries to read only the changed records from source to insert into target Dimension instead of using the cumbersome ETL Method.
  • Read and understood the PL/SQL code and converted it into Informatica ETL code.
  • Worked with Synch Processes to detect the changed and new data from the source to insert the new records and to update the old records.
  • Worked with Fusion UI to upload the files which will trigger the whole process from loading the stage table, validating the data elements and then loading the mart with successful source records.
  • Worked with Fusion UI to kick off the events such as Sync Process, BPS Process, etc., through the Event Manager.
  • Verified/Validated the reports again through the Fusion UI and re validated the code in case of any issues.
  • Acted as the metadata coordinator and participated in importing the metadata into the project folders so that its shortcuts could be used by the individual developers.
  • Worked with the ECMS/ARM process to move the code from individual folders to the Project folder and then deploy the approved code into UAT and Production environments.
  • Experienced with JIRA and SVN tools to export and import the code from one environment to another.
  • Conducted the unit testing to find and resolve the issues in a single piece of code immediately after the design.
  • Experienced with Control-M for automating the workload at daily, weekly or monthly intervals.
  • Coordinated with the testing team and data owners to resolve the defects as and when they are logged by the testing team.
  • Provided production support and solved the production issues as and when they were noticed.
  • Coordinated with the offshore development team by delegating the tasks and guiding them in building the code and solving the issues.
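
A minimal sketch of the kind of post-session file-handling script referenced above (merge, clean up, deliver). The directories, file patterns and delivery host are illustrative assumptions, not actual project values.

#!/bin/ksh
# Sketch: merge session output files, clean up temp files, deliver the result.
# Paths, file patterns and the sftp host are placeholders.
STAGE_DIR=/data/etl/work
OUT_DIR=/data/etl/outbound
TODAY=$(date +%Y%m%d)

# Merge the part files produced by the session into one extract file
cat "$STAGE_DIR"/extract_part_*.dat > "$OUT_DIR/extract_$TODAY.dat"

# Delete the temporary part files once the merge has succeeded
rm -f "$STAGE_DIR"/extract_part_*.dat

# Send the merged file to the specified downstream location
sftp batch_user@downstream.example.com <<EOF
put $OUT_DIR/extract_$TODAY.dat /incoming/extract_$TODAY.dat
bye
EOF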

Environment: Informatica Power Center 9.5, Oracle 11g/10g/9i, MS Access, TOAD 11.5, Rapid SQL, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential, Thousand Oaks, CA

Sr. Informatica Developer

Responsibilities:

  • Interacted with various business people on the external vendors' side, gathered the business requirements and translated them into technical specifications.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Experienced working with team and lead developers; interfaced with business analysts, coordinated with management and understood the end-user experience.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Involved in ETL process from development to testing and production environments.
  • Responsible for creating interfaces using different types of mappings using various transformations like Expression, filter, router, Aggregator, Look up, Joiner, Stored Procedure, Update Strategy, Etc.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters for the delta process to extract only the additional data added during that period.
  • Worked with data cleansing functions like REPLACESTR, REPLACECHR, ISNULL, IS_SPACES and IS_NUMBER to clear the data of any foreign characters.
  • Extensively used the Debugger to identify bugs in existing mappings by analyzing the data flow and evaluating transformations.
  • Worked with mappings to dynamically generate parameter files used by other mappings.
  • Involved in performance tuning of the ETL process by addressing various performance issues at the extraction and transformation stages.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Documented the mappings used in ETL processes including the Unit testing and Technical document of the mappings for future reference.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Scheduled the Informatica workflows to start at a specified date and time and repeat on an ongoing schedule.
  • Participated in unit testing to validate the data in the flat files that are generated by the ETL Process.
  • Logged on to Quality Center to find the defects posted by the testing team and fixed those defects.
  • Used SQL*Loader to load the flat file data into temp tables to compare the data with that generated from CMA (a loader sketch follows this list).
  • Involved in creation of Schema objects like Indexes, Views, and Sequences.
  • Wrote SQL and PL/SQL code and stored procedures to drop and re-create indexes, generate Oracle sequences, and automatically update the Process Run table so that the delta process works accurately.
  • Used MKS Toolkit to execute the UNIX shell scripts that perform pre/post-session tasks.
  • Extensively worked with Unix Shell scripting to validate and verify the data in the flat files generated by the ETL process.
  • Developed post-session and pre-session shell scripts for tasks like merging flat files after loading, deleting temporary files, changing the file name to reflect the file generated date etc.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different servers.
  • Developed the UNIX shell scripts to send out an E-mail on success of the process indicating the destination folder where the files are available.
  • Involved in the migration of Informatica mappings from the Development to the Production environment.
  • Coordinated with the Informatica administration team during deployments.
  • Provided production support and solved production issues as soon as they were noticed.
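
A hedged sketch of the SQL*Loader step referenced above: the control-file layout, table, columns, paths and connection variables are illustrative assumptions, not actual project definitions.

#!/bin/ksh
# Sketch: load a pipe-delimited flat file into a temp table with SQL*Loader.
# Table, column, path and connection names are placeholders.
cat > /tmp/claims_tmp.ctl <<'EOF'
LOAD DATA
INFILE '/data/etl/in/claims_extract.dat'
TRUNCATE
INTO TABLE claims_tmp
FIELDS TERMINATED BY '|'
(claim_id, member_id, paid_amt, service_dt DATE "YYYYMMDD")
EOF

sqlldr userid="$DB_USER/$DB_PASS@$DB_TNS" \
       control=/tmp/claims_tmp.ctl \
       log=/tmp/claims_tmp.log bad=/tmp/claims_tmp.bad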

Environment: Informatica Power Center 8.6.1/7.1.4, Oracle 11g/10g/9i, MS Access, TOAD 9.0, SQL Developer, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential

Informatica Developer /Lead ETL Test Analyst

Responsibilities:

  • Interacted with various business people on the MVS and Facets sides, gathered the business requirements and translated them into technical specifications.
  • Documented business requirements, discussed issues to be resolved and translated user input into ETL design documents.
  • Worked with data analysts to implement Informatica mappings and workflows, shell scripts and stored procedures to meet business requirements.
  • Experienced in creating an XML definition in the PowerCenter repository when the source is an XML file.
  • Involved in ETL process from development to testing and production environments.
  • Guided the testing team (three testers) for CVP to perform end-to-end testing.
  • Developed test cases and mapped them to various business and user requirements.
  • Loaded the test cases in MQC and guided the team with execution of these test cases and logging of defects in MQC.
  • Involved in meetings as a test lead to discuss the test scenarios with the business team and got approval for system testing and UAT.
  • Extracted data from various sources like Oracle and flat files and loaded into the target Oracle database.
  • Created mapping using various transformations like Joiner, Filter, Aggregator, Lookup, Stored Procedures, Router, Sorter, Rank, Expression, Normalizer and Update Strategy.
  • Developed Informatica mappings to generate the FVR (Facets voucher record - claims processed for the current date), which contains the EOBs (Explanations of Benefits), Notices of Payment (NOPs) and checks issued for the current voucher date.
  • Loaded TDS (tactical data store) with the derived voucher fields from MVS (Medical Vouchering System) from the reverse flow flat file.
  • Involved in performance tuning of the ETL process by addressing various performance issues at the extraction and transformation stages.
  • Extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings and sessions.
  • Created and Executed workflows and Worklets using Workflow Manager to load the data into the Target Database.
  • Extensively worked on Mapping Variables, Mapping Parameters, Workflow Variables and Session Parameters.
  • Developed Session Parameter files for the workflows.
  • Extensively participated in System/Integration/Performance testing.
  • Analyzed the source fields and wrote SQL queries for field to field validation by referring source to target mapping document.
  • Developed test case’s for business and user requirements to identify claims for Institutional, Professional, Subscriber paid, etc. and wrote SQL queries to validate the data on the source and target databases according to source to target mapping document.
  • Involved in regular discussions with the Facets team to enter test data.
  • Provided test data as per the test data requirements provided by the Medical Vouchering System team.
  • Extensively used Mercury Quality Center to load test cases, execute them and log defects found in system testing.
  • Ran JCLs regularly to kick off jobs from the mainframe.
  • Extensively worked with UNIX shell scripting (Korn shell) to validate and verify the data in the flat files sent to MVS; a validation sketch follows this list.
  • Responsible for ETL process under development, test and production environments.
  • Handled Production issues and monitored Informatica workflows in production.
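
A minimal Korn-shell sketch of the flat-file validation referenced above. It assumes a pipe-delimited layout in which detail records start with "D" and a trailer record starting with "T" carries the expected record count; the layout and paths are illustrative, not the actual MVS feed format.

#!/bin/ksh
# Sketch: reconcile a pipe-delimited extract against its trailer control record.
# File layout (D = detail, T = trailer with count in field 2) and path are assumed.
FILE=/data/etl/out/mvs_feed.dat

detail_count=$(awk -F'|' '$1 == "D" { n++ } END { print n + 0 }' "$FILE")
trailer_count=$(awk -F'|' '$1 == "T" { print $2 }' "$FILE")
total_amt=$(awk -F'|' '$1 == "D" { sum += $5 } END { printf "%.2f", sum }' "$FILE")

if [ "$detail_count" -ne "$trailer_count" ]; then
    echo "Record count mismatch: file=$detail_count trailer=$trailer_count" >&2
    exit 1
fi
echo "Validated $detail_count detail records, control total $total_amt"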

Environment: Informatica Power Center 8.5/8.1, Perl, Informatica Power Exchange 8.1.1, Oracle 11g/10g, DB2, MS Access, Facets 4.51, TOAD 9.0, SQL Developer, mainframe JCL, PL/SQL, UNIX, Mercury Quality Center, Windows NT 4.0

Confidential

Senior Informatica Developer

Responsibilities:

  • Used Power Exchange to create a data map for the given dataset name from the mainframe environment.
  • Imported this data map, which represents the data, into the Informatica repository using the Informatica ODBC driver.
  • Interacted with various departments in the mainframe environment to know about the format of data coming from mainframe environment.
  • Used Informatica Designer to create complex mappings using different transformations to move data to Oracle database.
  • Developed mappings in Informatica to load the data from various sources into the Data Warehouse using different transformations like Source Qualifier, Expression, Filter, Router, Lookup, Aggregate, Rank, Update Strategy and Joiner.
  • Worked extensively with Normalizer transformation to normalize the data from the Mainframe environment.
  • Developed expressions to convert dates in Gregorian and Julian formats to a standard date format that suits the database (see the conversion sketch after this list).
  • Developed logic to denormalize the data from database to upload it back to mainframe.
  • Extracted (Oracle), Transformed and Loaded data into Flat files using Informatica mappings and complex transformations (Aggregator, Joiner, Lookup, Update Strategy, Source Qualifier, Filter, Router and Expression).
  • Used FTP connection browser available in Informatica Workflow manager to FTP the flat files into the Mainframe environment.
  • Designed and Developed the Informatica workflows/sessions to extract, transform and load the data into Target.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and sent e-mail using server manager.
  • Wrote SQL and PL/SQL code, stored procedures and packages: to drop and re-create indexes, generate Oracle sequences, automatically clean up the base tables if a mapping had already run, logically expire existing records and update specific codes, and incorporate business rules into packages to eliminate transformation dependencies where possible.
  • Responsible for ETL process under development, test and production environments.
  • Handled Production issues and monitored Informatica workflows in production.
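
The Gregorian/Julian conversions mentioned above were built as Informatica expressions; purely as an illustration, the equivalent conversion on the Oracle side is sketched below with placeholder literals and connection variables. Day 061 of 2024 falls on 1 March because 2024 is a leap year, so both columns return the same date.

#!/bin/ksh
# Sketch: converting Gregorian (YYYYMMDD) and Julian (YYYYDDD) date strings
# to standard Oracle dates. Connection variables and literals are placeholders.
sqlplus -s "$DB_USER/$DB_PASS@$DB_TNS" <<'EOF'
SELECT TO_CHAR(TO_DATE('20240301', 'YYYYMMDD'), 'DD-MON-YYYY') AS gregorian_dt,
       TO_CHAR(TO_DATE('2024061',  'YYYYDDD'),  'DD-MON-YYYY') AS julian_dt
  FROM dual;
EXIT
EOF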

Environment: Informatica Power Center 8.5/8.1.1/7.1.2 (Designer, Workflow Manager), Perl, Power Exchange 8.1.1, Oracle 11g/10g, SQL Navigator 5.0, Windows NT.
