
Sr Informatica Developer Resume


San Ramon, CA

SUMMARY:

  • Informatica PowerCenter Certified Professional with around 10 years of experience in Information Technology across various data warehousing tools.
  • 10 years of professional experience spanning requirements analysis, design, application development, testing, deployment, and production support of data warehousing solutions.
  • Extensively worked with Teradata versions V2R5, 12, and 14 and used utilities like BTEQ, FastExport, FastLoad, TPT, and MultiLoad to export and load data to/from different source systems, including flat files.
  • Proficient in performance analysis, partitioning, monitoring, and SQL query tuning using EXPLAIN plans, COLLECT STATISTICS, hints, and SQL Trace in both Teradata and Oracle.
  • Hands-on experience with query tools like TOAD, SQL Developer, PL/SQL Developer, Teradata SQL Assistant, and Queryman.
  • Excellent Experience with different indexes (PI, SI, JI, AJIs, PPI (MLPPI, SLPPI)) and Collect Statistics.
  • Performed data profiling using Informatica Data Explorer (IDE) and Harvester for Oracle, SQL Server, MS Access, SFDC, and flat file sources.
  • Experienced with data cleansing tools like Informatica Data Quality (IDQ) to handle data anomalies and standardize data.
  • Experienced in capturing real-time data (CDC) using Informatica PowerExchange, with knowledge of unstructured data files and B2B Data Transformation.
  • Three plus (3+) years of experience in Informatica administration activities.
  • Extensive database experience using Oracle 11g/10g/9i/8i, SQL, PL/SQL, SQL*Plus, DB2, and SQL Server.
  • Extensively worked on integration of various data sources like Oracle, flat files, mainframe sources (COBOL), VSAM, XML files, ERP, web services, SQL Server, Salesforce, and DB2 into the staging area.
  • Experienced with SQL and PL/SQL; created functions, procedures, indexes, synonyms, tables, and views.
  • Created Pig and Hive scripts to apply ETL logic to HDFS files and load into the EDW.
  • Worked on various scheduling tools like Control-M, Tivoli, UC4, and crontab.
  • Ability to quickly grasp and apply cutting-edge ideas, methods and technologies. Effectively manage important projects and programs in fast-paced, time-critical environments.
  • Moved semi-structured data from the HDFS file system to structured tables (Teradata).
  • Wrote Apache Pig scripts to process HDFS data.
  • Good knowledge on Agile Methodology and the scrum process.
  • Involved in review meetings with project managers, developers, and business associates for project reviews.
  • Actively involved in Quality Processes and release management activities - To establish, monitor and streamline quality processes in the project.
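The HDFS-to-Teradata movement mentioned above would typically be driven by a Sqoop export. A minimal sketch, assuming a hypothetical Teradata JDBC endpoint — the connection string, table name, and paths are illustrative placeholders, not taken from any project described here, and the command is echoed rather than executed:

```shell
#!/bin/sh
# Hypothetical Sqoop export sketch: push pipe-delimited HDFS output into a
# Teradata table. All names below are placeholders for illustration only.
build_sqoop_export() {
    echo sqoop export \
        --connect "jdbc:teradata://tdprod/DATABASE=edw" \
        --username etl_user \
        --table ORDERS_FACT \
        --export-dir /user/etl/orders/output \
        --input-fields-terminated-by '|'
}
```

A real job would run the generated command (and pass credentials securely) instead of echoing it.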

TECHNICAL SKILLS:

ETL/Data Quality Tools: Informatica PowerCenter 9.x/8.x/7.x/6.x, Informatica PowerExchange, Informatica IDE, Informatica Data Quality (IDQ), Metadata Manager

BI Tools: BusinessObjects XI/R2/6.5 (Desktop Intelligence, EBI, Designer), Cognos, Informatica Data Analyzer/Power Analyzer

Data Profiling Tools: Informatica Data Explorer (IDE), Informatica Data Analyst, Harvester

RDBMS: Oracle 12c/11g/10g/9.x/8.x, Teradata 12/13/14, DB2, SQL Server 2005, MS Access

Design/Application Tools: Erwin, Microsoft Visio (Visio Soft Diagram), TOAD, SQL Developer, Control-M, Tivoli, UC4

Languages/Scripting: SQL, PL/SQL, UNIX Shell Scripting

Operating Systems: UNIX/LINUX, MS-DOS, Windows NT/2000/XP

Methodologies: Ralph Kimball and Bill Inmon methodologies, dimensional modeling concepts - Snowflake/Star schema models.

PROFESSIONAL EXPERIENCE:

Confidential, San Ramon, CA

Sr Informatica Developer

Environment: Informatica PowerCenter 9.6.1, Oracle 11g, Teradata 14, Tidal, UNIX

Responsibilities:

  • Created design documents, ETL Source to Target Mapping documents (ETL Specifications), and worked on mappings using multiple transformations.
  • Participated in client meetings and discussions to understand the business and requirements.
  • Worked on various transformations like Aggregator, Expression, Sequence Generator, Filter, Router, Union, Lookup, Update Strategy, Stored Procedure, etc.
  • Worked on Informatica Data Quality (IDQ) transformations like Match, Standardizer, Merge, Exception, Address Doctor, Key Generator, Labeler, and Parser, and exported them to PowerCenter.
  • Worked on tuning Informatica mappings and queries.
  • Worked on end-to-end execution of the project and created production implementation documents and support manuals.
  • Created health checks for every hop like stage, data store, EDR, CTR and SAM data warehouse.
  • Loaded data into multiple targets like Oracle, Teradata, Flat files.

Confidential, San Jose, CA

Informatica Consultant

Environment: Informatica PowerCenter 9.5.1, Oracle, Teradata 14, Micro Strategy, Java, Cassandra, Hadoop, UC4, Rally, UNIX

Responsibilities:

  • Worked on business spikes for multiple user stories; interacted with business owners and users to gain a better understanding of the system or specific application.
  • Created data model, research, and high- and low-level ETL/MicroStrategy design Confluence pages based on business spikes.
  • Worked on production issues/ad hoc requests raised by business users and resolved them in a timely manner.
  • Worked with the Java transformation and wrote Java code to parse JSON strings, find German characters in data feeds or XMLs, and create a signature file.
  • Identified bottlenecks and worked on handling large volume data, tuning Informatica, Reporting queries, large volume history updates.
  • Loaded data into multiple targets like Oracle, Teradata, Flat files and XML files.
  • Worked on BTEQ, FastExport, and FastLoad scripts to load data into the Confidential EDW and BML data warehouse.
  • Created SET/MULTISET and Oracle tables as part of the design process, followed by LDM/PDM approvals.
  • Created MicroStrategy reports to generate summary and detail reports.
  • Demoed the research pages/ETL/MicroStrategy reports to business users at the end of each sprint.
  • Created UNIX scripts for file handling; FTP'd encrypted and zipped flat files and XMLs to various vendors.
  • Worked as a Scrum Master in the Agile methodology; performed code reviews and QA on peers' code on a rotation basis.
  • Worked on moving semi-structured data from HDFS to structured databases.
  • Scheduled jobs using Control-M and UC4 and followed all the PP and BML process to move code to prod.
  • Wrote Apache Pig scripts to process HDFS data.
  • Developed Sqoop scripts to enable interaction between Pig and Teradata.
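The UNIX file-handling work described above can be sketched as a small script that compresses outbound flat files and records a checksum manifest before the vendor hand-off. This is a minimal illustration, not the actual script: directory names are placeholders, and the encryption and FTP steps a real job would perform are deliberately omitted.

```shell
#!/bin/sh
# Hypothetical sketch of an outbound file-handling step: compress each flat
# file and record a checksum manifest before handing off to a vendor.
# Directory names are placeholders; the real script would follow this with
# gpg encryption and an sftp/ftp push, which are omitted here.
stage_outbound() {
    src_dir="$1"
    out_dir="$2"
    mkdir -p "$out_dir"
    : > "$out_dir/manifest.txt"
    for f in "$src_dir"/*.dat; do
        [ -e "$f" ] || continue          # skip if no files matched the glob
        base=$(basename "$f")
        gzip -c "$f" > "$out_dir/$base.gz"
        cksum "$out_dir/$base.gz" >> "$out_dir/manifest.txt"
    done
}
```

The manifest lets the downstream team verify each file arrived intact before loading.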

Confidential, Newark, CA

Sr Informatica Developer

Environment: Informatica PowerCenter 9.5, Informatica IDQ, Microsoft SQL Server 2012, Oracle 11g, MongoDB.

Responsibilities:

  • The Import functionality extracts data from SQL Server and loads it into flat files, which are then fed to the Export process and loaded into the NoSQL database MongoDB and a SQL Server database.
  • Extensively worked with delimited flat files and tuned the Informatica process to run faster, as Import is part of the application and its run time is critical for insurance companies to generate losses.
  • Worked on error handling techniques and used Splunk to upload the error handling files.
  • Created ETL Source to Target Mapping document (ETL Specifications), ETL design Documents
  • Worked on various transformations like Aggregator, Expression, Sequence Generator, Filter, Router, Union, Lookup, Update Strategy, Stored Procedure, etc.
  • Worked on Informatica Data Quality (IDQ) transformations like Match, Standardizer, Merge, Exception, Address Validator, Key Generator, Labeler, and Parser, and exported them to PowerCenter.
  • Involved in deploying Informatica code to multiple environments on the cloud.
  • Created the Informatica Web Services Hub, grid, and nodes; worked on Informatica admin activities like releasing locks, folder creation, assigning user access, and repository backups.
  • Worked with multiple RMSOne process teams to make sure all the processes run and generate the loss data.

Confidential, San Francisco, CA

ETL Developer

Environment: Informatica PowerCenter 8.6.1, Teradata V12, SQL Server, Oracle 10g, Teradata SQL Assistant, MicroStrategy, Tableau

Responsibilities:

  • Created ETL Source to Target Mapping document (ETL Specifications), and worked on mappings using multiple transformations.
  • Participated in client meetings and discussions to understand the business and requirements.
  • Worked on Teradata aggregate materialized views to help MicroStrategy reports pull data faster.
  • Created SQL Server scripts to perform ETL and load into the stage and data mart layers.
  • Supported business users in the UAT phase by creating complex scripts and presenting high-level numbers they could forward to higher management.
  • Used Teradata utilities FastLoad and MultiLoad to load data.
  • Wrote BTEQ scripts to transform data and FastExport scripts to export data.
  • Wrote, tested, and implemented Teradata FastLoad, MultiLoad, and BTEQ scripts, DML, and DDL.
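The BTEQ scripting above can be illustrated with a small wrapper that generates a script file. This is a hedged sketch only: the host, credentials, and table names are placeholders rather than details from the engagement, and the final execution step a real wrapper would perform (`bteq < "$1"`) is intentionally left out.

```shell
#!/bin/sh
# Hypothetical sketch: emit a BTEQ script that moves one day's staged rows
# into a target table. Host, credentials, and tables are illustrative only;
# a real wrapper would finish with `bteq < "$1"` to execute the script.
write_bteq_script() {
    cat > "$1" <<'EOF'
.LOGON tdprod/etl_user,etl_password;
INSERT INTO edw.orders_fact
SELECT order_id, customer_id, order_amt, order_dt
FROM   stage.orders_stg
WHERE  load_dt = CURRENT_DATE;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF
}
```

The `.IF ERRORCODE` check makes the wrapper's exit status reflect a failed insert, which is what schedulers like Control-M key on.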

Kaiser Permanente, Pleasanton, CA

Informatica Developer

Environment: Informatica PowerCenter 8.6.1, Oracle 10g/9i, UNIX, Toad, SQL Developer, SQL Navigator, Tivoli

Responsibilities:

  • Participated in client meetings and whiteboard discussions to finalize the requirements.
  • Involved in creating complex mappings and mapplets; worked on different PowerCenter transformations like Aggregator, Expression, Joiner, Union, Update Strategy, Lookup, Normalizer, and SQL.
  • Involved in tuning at both the Informatica and database levels (parallelism, row migration, PCT parameters, table reorgs).
  • Involved in migrating code on a daily basis and promptly worked on production support issues.
  • Scheduled the workflows using Tivoli.
  • Created the ETL design and Source-to-Target mapping documents, along with operational documents to hand the code over to the production support team.
  • Worked on production support issues based on priority and resolved them in a timely manner.

Confidential, San Diego, CA

Informatica Lead Developer

Environment: Informatica PowerCenter 8.6.1, Oracle 10g/9i, Teradata, Cognos, UNIX, Toad, SQL Developer

Responsibilities:

  • Participated in client meetings and whiteboard discussions to finalize the requirements.
  • Worked with multiple sources like Oracle, web services (including REST), and SFDC.
  • Worked on multiple PowerCenter transformations like HTTP, XML Parser, Aggregator, Lookup, Joiner, Update Strategy, Expression, and Union.
  • Worked on table partitioning and parallelism to increase performance.
  • Experienced in handling large-volume data.
  • Worked on deployment of code from Development to Test and Test to Production servers.
  • Worked with upstream and downstream teams to create complete Control-M job schedules, applied all dependencies, and scheduled them on daily and monthly bases.
  • Coordinated with the offshore team to help them understand the system better and deliver quality work.
  • Involved in testing of Cognos reports and the extranet.
  • Involved in creating design, implementation, and operational documents.
