
Sr. Informatica Developer/Production Support Engineer Resume

Somerset, NJ

SUMMARY:

  • EIGHT years of IT experience as an ETL Developer/Program Analyst, including Data Warehouse/Database Developer experience using Informatica Cloud, Informatica PowerCenter 10.1.0/9.6.1/8.x, Power Exchange 9.5.1/8.6/8.1, Informatica Data Quality, and Alteryx.
  • Over SEVEN years of database experience on Oracle (including AWS RDS), Teradata, MS SQL Server, MS Access, PostgreSQL, MongoDB, and Hadoop.
  • Experience with AWS cloud services like EC2, S3, RDS, CloudWatch, and IAM for troubleshooting various performance issues, server maintenance, and cost reduction analysis.
  • Experience in MapReduce, Hive, Pig, Impala, Sqoop, Oozie, and Spark.
  • Experience in all the phases of Data warehouse life cycle involving Requirement Analysis, Design, Coding, Testing, and Deployment.
  • Extensively worked on the ETL mappings, analysis and documentation of OLAP reports requirements. Solid understanding of OLAP concepts and challenges, especially with large data sets.
  • Strong knowledge of Entity - Relationship concept, Facts and dimensions tables, slowly changing dimensions and Dimensional Modeling (Star Schema and Snow Flake Schema).
  • Experience in integration of various data sources like Oracle, DB2, MS SQL Server, PostgreSQL, Mongo DB, MS access, Datalake and flat files into staging area and DWH DB.
  • Experienced in using the IDQ tool for profiling, applying rules, and developing mappings to move data from source to target systems.
  • Experience in creating reusable components like transformations, mapplets, and tasks.
  • Experience in writing, testing and implementation of the PL/SQL triggers, stored procedures, functions, packages.
  • Extensive experience in writing UNIX shell scripts and automation of the ETL processes using UNIX shell scripting.
  • Experience in using automation scheduling tools like Tidal, Control-M, and Autosys.
  • Proficient in interaction with the business users by conducting meetings with the clients in Requirements Analysis phase.
  • Assign work and provide technical oversight to onshore and offshore developers.
  • Experience in Scrum, Agile methodology and tools (JIRA, Confluence & LeanKit).
  • Experience in coordinating with Business Analysts, QA team, DBA team and End Users at various stages of Software life cycle.
  • Experience in providing end to end solution and on time delivery of project including implementation and support in production.

TECHNICAL SKILLS:

ETL Tools: Informatica 10.x/9.x/8.x/7.x/6.x (PowerCenter/Developer/Informatica Cloud), Alteryx

Database: Oracle 11g/10g/9i/8i/7.x, SQL Server 2005/2000, MongoDB, PostgreSQL, Teradata 14/13/12/V2R6

BI Tools/Analytics: OBIEE 11.1.1.x/10.1.3.x; Tableau Desktop and Tableau Server 9.2/9.0/8.1/8.0

Scheduling Tools: Tidal, Autosys, Control M

Data Modelling Tool: Erwin

Methodologies: Data Modeling (Logical/Physical), Star/Snowflake Schema, Fact & Dimension Tables, ETL, OLTP/OLAP, Software Development Life Cycle (SDLC)

Languages: SQL, PL/SQL, HTML, Scala

Operating System: Win 95/98/2000/XP, Win NT 4.0, UNIX/Linux

Oracle Tools/Utilities: SQL*Loader, Oracle Designer, SQL Developer, TOAD, SQL Navigator, JDeveloper, Oracle Developer 2000

Scripting Languages: VBScript and UNIX shell script

Big Data: Hadoop, Hive, Sqoop, Spark, Impala, Oozie

PROFESSIONAL EXPERIENCE:

Confidential, Somerset, NJ

Sr. Informatica Developer/Production Support Engineer

Responsibilities:

  • Worked on various phases of the SDLC, from requirement gathering and analysis through design, development, testing, and production migration.
  • Worked on Informatica Power Center tools - Designer, Repository Manager, Workflow Manager and Workflow Monitor. Experience translating business problems into actionable data quality initiatives.
  • Worked with the Informatica Data Quality toolkit: analysis, data cleansing, data matching, data conversion, exception handling, and the reporting and monitoring capabilities of IDQ.
  • Extensively used transformations such as Source Qualifier, Aggregator, Filter, Joiner, Update Strategy, Unconnected and Connected Lookups, Router, and Sequence Generator.
  • Extracted data from various source systems like Oracle, Teradata, SQL Server, and flat files as per the requirements. Designed ETL process flows to load the data into the Oracle database from heterogeneous sources.
  • Worked on EDL data ingestion using Spark and Sqoop, and automated the ingestion process using Oozie.
  • Worked on AWS RDS, EC2 & S3 maintenance and cost savings using different approaches.
  • Involved in the production job migration from Tidal to Control-M.
  • Worked on the Informatica Cloud code migration to Informatica PC & Developer tool.
  • Developed code to maintain database security by enabling auditing and tracking DB object utilization by users and applications.
  • Used the Alteryx ETL and analytics platform, either with preexisting modules or by customizing applications to the current task and business rules.
  • Developed pre-session and post-session shell scripts for tasks such as merging flat files after creation, deleting temporary files, and renaming files to reflect the generation date. Optimized query and session performance.
  • Designed and developed UNIX shell scripts as part of the ETL process to compare control totals, automate the process of loading, pulling and pushing data from and to different Servers. Involved in Structural and Functional Testing and Migrating the Code to Production.
  • Worked in complete agile methodology.
  • Onsite - Offshore team coordination.
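
The post-session scripting described above can be sketched as follows. This is a minimal demo, not production code: every directory, feed, and file name (customer_feed, part_*.dat) is invented for illustration.

```shell
#!/bin/sh
# Hypothetical post-session sketch: merge the session's part files into
# one flat file whose name carries the generation date, then delete the
# temporary parts. All paths and names here are made up for the demo.

STAGE_DIR=$(mktemp -d)    # stand-in for the session output directory
TARGET_DIR=$(mktemp -d)   # stand-in for the downstream pickup directory
RUN_DATE=$(date +%Y%m%d)

# Simulate two part files written by the session.
printf 'row1\n' > "$STAGE_DIR/part_1.dat"
printf 'row2\n' > "$STAGE_DIR/part_2.dat"

# Merge the parts (in lexical order) into a date-stamped target file.
MERGED="$TARGET_DIR/customer_feed_${RUN_DATE}.dat"
cat "$STAGE_DIR"/part_*.dat > "$MERGED"

# Remove the temporary part files once merged.
rm -f "$STAGE_DIR"/part_*.dat

wc -l < "$MERGED"
```

In a real workflow, a script like this would be attached as a post-session success command, with the stage and target paths supplied by the session rather than created with mktemp.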

Environment: Informatica PowerCenter 10/9.6/Cloud/BDE, Informatica Data Quality 9.6, Teradata 14, Hadoop, Hive 2.1, Sqoop, Spark, Impala, Oozie, Tableau (Desktop/Server), AWS RDS, EC2 & S3, SQL Server 2008, HP ALM, JIRA, LeanKit, Atlassian Confluence, Stash, SharePoint, Oracle SQL Developer, Linux Server, MBOX, AWS VPCx, Microsoft Suite (Excel, Word, and Visio), Control-M, Tidal

Confidential, Somerville, NJ

Sr. Informatica Developer

Responsibilities:

  • Working jointly with business analysts and product owners to elicit visualization requirements, recommending changes, creating prototypes, and giving demos to the users.
  • Capturing end user stories on JIRA agile tool, analyzing and laying out a design in continuous integration environment and grooming the end user stories with business users, business analysts and delivering the user stories every sprint.
  • Worked on Informatica cloud and Informatica Power Center tools- Designer, Repository Manager, Workflow Manager, and Workflow Monitor.
  • Parsed high-level design specification to simple ETL coding and mapping standards.
  • Designed and customized data models for the data warehouse, supporting data from multiple sources like Oracle (JIRA), Teradata, Cloudera big data, and flat files in real time.
  • Developed mapping parameters and variables to support SQL override and dynamic parameter.
  • Worked extensively on Teradata utilities like MLOAD, FLOAD, FastExport, and TPT.
  • Worked on Teradata queries, indexes, macros, partitions, and procedures. Worked on explain plans and tuned queries.
  • Preparing production migration documents for each Confidential release and work with production team during migrations. Preparing functional documents, technical designs solutions and business rules on Atlassian confluence in each sprint.
  • Worked on multiple projects using the Informatica Developer tool (IDQ) for data cleansing, validation, and formatting.
  • Involved in migrating code from Oracle, Teradata and SQL-Server to HDFS and worked extensively in Hive.
  • Worked on Informatica BDE to migrate code from PowerCenter to a Hadoop cluster and integrated many systems.
  • Involved in migration of mappings from IDQ to PowerCenter and vice versa.
  • Logical and Physical data modeling was done using Erwin for data warehouse database in STAR SCHEMA.
  • Creating Informatica mappings and database tables, and running the mappings through shell scripts to perform ETL from source to target.
  • Aggregating data, calculating metrics to report across the dashboards in the database to improve the performance.
  • Creating parameters and integrating with calculated fields in tableau dashboards to report at different granularity of the data.
  • Building Ad-hoc reports, publishing data sources, dashboards, embedding dashboards for end users and creating refresh extracts on tableau Server.
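
Running mappings through shell scripts, as in the bullets above, typically means wrapping the pmcmd command-line client. The sketch below only builds and prints such a call: the service, domain, folder, and workflow names are invented, and PMCMD_BIN defaults to `echo` so it runs without an Informatica installation.

```shell
#!/bin/sh
# Hypothetical pmcmd wrapper sketch. All names below are assumptions;
# with PMCMD_BIN left at its default (`echo`), the command is printed
# rather than executed.

PMCMD_BIN="${PMCMD_BIN:-echo}"
INFA_SERVICE="int_svc_demo"      # Integration Service name (assumed)
INFA_DOMAIN="dom_demo"           # domain name (assumed)
FOLDER="DWH_LOADS"               # repository folder (assumed)
WORKFLOW="wf_load_customer_dim"  # workflow name (assumed)

# pmcmd startworkflow with -wait blocks until the workflow completes,
# which lets the calling script act on the exit status.
CMD_OUT=$("$PMCMD_BIN" startworkflow \
    -sv "$INFA_SERVICE" -d "$INFA_DOMAIN" \
    -f "$FOLDER" -wait "$WORKFLOW")
echo "$CMD_OUT"
```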

Environment: Informatica PowerCenter 10/9.6/Cloud/BDE, Informatica Data Quality 9.6, Teradata 14, Hadoop, Hive 2.1, Tableau (Desktop/Server), AWS RDS, EC2 & S3, SQL Server 2008, HP ALM, JIRA, LeanKit, Atlassian Confluence, Stash, SharePoint, Oracle SQL Developer, Linux Server, MBOX, AWS VPCx, Microsoft Suite (Excel, Word, and Visio)

Confidential, Minneapolis, MN

Lead Informatica Developer

Responsibilities:

  • Converted business requirements into technical documents - BRD, explained business requirements in terms of technology to the developers.
  • Developed Data Flow diagrams to create Mappings and Test plans. The Data flow diagrams ranged from OLTP systems to staging to Data warehouse.
  • Developed Test plan to verify the logic of every Mapping in a Session. The test plans included counts verification, look up hits, transformation of each element of data, filters, and aggregation and target counts.
  • Developed complex Informatica mappings using various transformations- Source Qualifier, Normalizer, Filter, Connected Lookup, Unconnected Lookup, Update strategy, Router, Aggregator, Sequence Generator, Reusable sequence generator transformation.
  • Extensively used SCDs (Slowly Changing Dimensions) to handle incremental loading for dimension tables and fact tables.
  • Designed various mappings for extracting data from various sources involving Flat files, Oracle, Sybase and SQL Server, IBM DB2.
  • Created power exchange registration, data map and restart token for CDC real time process.
  • Worked on Debugging and Troubleshooting of the Informatica application. For debugging utilized Informatica debugger.
  • Worked on multiple Teradata load utilities like FastExport, FLOAD, MLOAD, and TPT to load data to the staging and OLAP layers.
  • Worked on Informatica Data Quality (IDQ) for various Data Profiling like column based, joined based and enterprise based.
  • Extensively worked on Standardizer, data cleansing, finding duplicates, exception handling and error clean up.
  • Worked on IDQ rules, mapplets, mappings, scorecards, data services, and applications.
  • Scheduled the workflow using CRONTAB command and PMCMD in UNIX.
  • Worked with production support systems that required immediate support.
  • Worked extensively on Real-time Change Data Capture (CDC) using Informatica Power Exchange to load data from DB2 mainframe and VSAM file.
  • Worked on Performance Tuning to optimize the Session performance by utilizing, Partitioning, Push down optimization, pre and post stored procedures to drop and build constraints.
  • Experience in building of the Repository (RPD) using the OBIEE toolset.
  • Developed OBIEE technology such as Answers, Dashboards, and BI Publisher.
  • Identified users and assessed their level of familiarity with the data. Segmented these users based on criteria like influence, authority, and participation levels.
  • Identified automated decision-making opportunities.
  • Worked on Teradata utilities BTEQ, MLOAD, FLOAD, and TPUMP to load the staging area.
  • Worked on audit logging, error logging, and checks while loading data to the data warehouse.
  • Created UNIX Script for ETL jobs, session log cleanup and dynamic parameter.
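
The crontab scheduling and session-log cleanup mentioned above can be sketched together. The retention window, log directory, and crontab path below are all assumptions for the demo, and the stale log's timestamp is simulated with `touch -t`.

```shell
#!/bin/sh
# Hypothetical session-log cleanup sketch. A crontab entry such as
#   0 2 * * * /opt/etl/bin/cleanup_session_logs.sh
# (path invented) would run it nightly.

LOG_DIR=$(mktemp -d)   # stand-in for the session log directory
RETENTION_DAYS=7       # assumed retention window

# Simulate one stale log (fixed past timestamp) and one current log.
touch -t 202001010000 "$LOG_DIR/s_old_session.log"
touch "$LOG_DIR/s_fresh_session.log"

# Delete logs last modified more than RETENTION_DAYS days ago.
find "$LOG_DIR" -name '*.log' -mtime +"$RETENTION_DAYS" -exec rm -f {} \;

ls "$LOG_DIR"
```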

Environment: Informatica PowerCenter 9.6/Cloud/BDE, Informatica Data Quality 9.6, AS-400(DB2), Oracle 11g, Teradata 14, Hadoop, Hive, OBIEE 10g, SQL Server 2005/2K, AIX, Windows XP, Sun Solaris, Putty, SCM, Rally

Confidential, NYC

Sr. Informatica Developer

Responsibilities:

  • Participated in the business analysis process and the development of ETL requirements specifications.
  • Involved in the design, development and implementation of the Enterprise Data Warehousing (EDW) process and Data Mart.
  • Designed the mapping document, which serves as a guideline for ETL coding; standards for naming conventions and best practices were followed in mapping development.
  • Extracted the data from the Flat files, Excel files, Operational systems, Oracle databases into staging area and populated onto data warehouse.
  • Developed a number of complex Informatica mappings, mapplets, and reusable transformations to implement the business logic and load the data incrementally.
  • Worked with Connected, Unconnected Lookups, Aggregator, Router, Update strategy, Sorter, Normalizer, SQL and SQ transformations.
  • Designed the CDC real-time Informatica recovery process.
  • Created jobs to start and stop real-time workflows, and a wrapper to generate restart tokens for failed workflows.
  • Created static and dynamic deployment groups to move code to higher region.
  • Extracted data from Flat files, SQL Server and Oracle and loaded them into Teradata.
  • Involved in Data Profiling & Data Analysis on database sources like Oracle and flat files.
  • Used Debugger extensively to validate the mappings and gained troubleshooting information about data and error conditions.
  • Used Teradata for running the SQL Queries to check the data loaded to the Target Tables.
  • Worked with Teradata loading utilities like Multi Load, Fast Load and TPump.
  • Involved in creating BTEQ scripts for validating data quality, such as referential integrity and not-null checks.
  • Utilized UNIX shell scripts for adding the header to the Flat file targets.
  • Worked on multiple Teradata load utilities like FastExport, FLOAD, and MLOAD to load data to the staging and OLAP layers.
  • Worked on Informatica Data Quality (IDQ) for Data Profiling, Standardizer, data cleansing and error clean up.
  • Worked on IDQ rules, mapplets, mappings, scorecards, data services, and applications.
  • Scheduled the workflow using CRONTAB command and PMCMD in UNIX.
  • Worked with production support systems that required immediate support.
  • Worked extensively on Real-time Change Data Capture (CDC) using Informatica Power Exchange to load data from DB2 mainframe and VSAM file.
  • Created Power Exchange Registration, Datamap, Restart Token. Configured recovery and hanging workflows job. Configured real-time job and created job to check health of all running jobs.
  • Participated in weekly meetings to discuss the status, issues and defects detected during the different stages of testing of the application.
  • Tested all the mappings and sessions in Development and UAT environments, and migrated them into the Production environment after successful testing.
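
Adding a header record to a headerless flat file target, as described above, is a small shell exercise. The file layout, column names, and sample rows below are invented for the demo.

```shell
#!/bin/sh
# Hypothetical header-prepend sketch for a flat file target.
# The CUST_ID|CUST_NAME header and the sample rows are made up.

TARGET=$(mktemp)
printf '1|Alice\n2|Bob\n' > "$TARGET"   # simulated headerless extract

HEADER='CUST_ID|CUST_NAME'
TMP=$(mktemp)
# Write the header first, append the original data, then swap the files
# so the target keeps its original path.
{ printf '%s\n' "$HEADER"; cat "$TARGET"; } > "$TMP"
mv "$TMP" "$TARGET"

head -1 "$TARGET"
```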

Environment: Informatica Power Center 9.1/9.6, Power Exchange 9.1/9.6, Informatica Data Quality 9.1/9.6, DB2 UDB, Oracle11g, Teradata 12/13, TOAD, Autosys, SQL* Loader, QVCS, Flat file, Sun Solaris UNIX, BTEQ, Windows Enterprise 2000

Confidential, Parsippany, NJ

Informatica Developer

Responsibilities:

  • Designed ETL processes using Informatica to load data from Flat Files, and Excel files to target Oracle Data Warehouse database.
  • Prepare ETL specifications based on the design and high-level logical flow provided by the BI Architect.
  • Develop code using Oracle SQL Loader / Informatica mappings, workflows and sessions as per ETL specifications and implemented CDC (Change Data Capture) scripts wherever required.
  • Create Performance Metrics and Deployment documents.
  • Perform unit, functional, integration, and performance testing, thereby ensuring data quality.
  • Create job chains to schedule data loads, trace bugs, and close HII logs if any. Coordinate with support teams to deploy code in stage and production environments.
  • Provide transition to RTS team after project go-live.
  • Used tuned SQL overrides in the Source Qualifier to meet business requirements.
  • Wrote pre-session and post-session scripts in mappings. Created sessions and workflows for designed mappings. Redesigned some of the existing mappings in the system to meet new functionality.
  • Created and used different reusable tasks like command and email tasks for session status.
  • Used Workflow Manager to create Sessions and scheduled them to run at specified time with required frequency.
  • Monitored and configured the sessions that are running, scheduled, completed and failed.
  • Created test cases, participated in system integration testing and UAT, and migrated code from Dev to Test to QA to Prod.
  • Compiled report presentations using tools like Business Objects.
  • Involved in writing UNIX shell scripts for Informatica ETL tool to fire off services and sessions.
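
One basic data-quality check in the testing described above is reconciling source and target row counts. In this sketch both counts are simulated; in practice the target count would come from a SELECT COUNT(*) against the loaded table.

```shell
#!/bin/sh
# Hypothetical count-reconciliation sketch; the source file and the
# target count are simulated for the demo.

SRC_FILE=$(mktemp)
printf 'r1\nr2\nr3\n' > "$SRC_FILE"   # simulated source extract

SRC_COUNT=$(wc -l < "$SRC_FILE" | tr -d ' ')
TGT_COUNT=3   # would come from SELECT COUNT(*) on the target table

if [ "$SRC_COUNT" -eq "$TGT_COUNT" ]; then
    RESULT="MATCH"
else
    RESULT="MISMATCH: src=$SRC_COUNT tgt=$TGT_COUNT"
fi
echo "$RESULT"
```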

Environment: Informatica 8.5, Oracle 11g, Flat files, Excel, SQL Server 2008, PL/SQL, TOAD, Linux, UNIX Shell Scripting, Windows XP, Business Object XI R2, Perl
