Sr. Informatica Developer/data Analyst Resume

Dallas

SUMMARY:

  • 8+ years of IT experience in data warehousing technology and all phases of the Software Development Life Cycle (SDLC), including business requirement analysis, application design, development, implementation and testing of data warehousing and database business systems for the Retail, Healthcare, Financial and Insurance domains.
  • 6 years of experience in the design and development of ETL methodology using Informatica PowerCenter 10.0/9.6.1/9.0.1/8.x, Informatica PowerExchange 10.0/9.6.1/9.0.1 and Informatica Data Quality 10.0/9.6.1.
  • 7+ years of experience working with Oracle 11g/10g, PL/SQL, MySQL, SQL Server and Netezza databases.
  • 3+ years of experience in data quality, profiling, validation, reference checks and exception handling using Informatica Data Quality.
  • 1+ years of working experience in Big Data technologies and integration using HDFS and Hive.
  • 1+ years of working experience in the AWS cloud environment using AWS S3 buckets, Snowflake and RDS (MySQL).
  • 3+ years of experience in Change Data Capture (CDC) methodology using PowerExchange 10.0/9.x.
  • 3+ years of experience in Teradata 14/13 SQL and utilities such as MultiLoad (MLOAD), FastLoad and TPT.
  • Interacted with end-users and functional analysts to identify and develop Business Requirement Documents (BRD) and transform them into technical requirements.
  • Worked extensively in requirements analysis and design, development, testing and production support.
  • Very strong knowledge of Informatica Data Quality transformations such as Address Validator, Parser, Labeler, Match, Exception, Association and Standardizer.
  • Moderate experience working with the Information Technology Infrastructure Library (ITIL).
  • Prepared various types of documents such as requirements-gathering documents, ETL specifications, data mappings, test cases, data dictionaries, etc.
  • Expert in writing optimized SQL queries using Oracle, SQL Server, Teradata, MySQL and Hive.
  • Exposure to writing SQL using analytical functions such as ranking functions, reporting aggregate functions, LAG/LEAD functions and FIRST/LAST functions (see the SQL sketch after this list).
  • Extensive knowledge of Informatica tuning and SQL and PL/SQL tuning.
  • Experience in the integration of various data sources such as Oracle, SQL Server and sequential files into the staging area.
  • Very strong knowledge of the end-to-end process for Data Quality and MDM requirements and their implementation.
  • Experience in AWS (Amazon Web Services), S3 buckets and Redshift (the AWS data warehouse service).
  • Performed unit testing, system testing and integration testing, supported users during UAT, and prepared test reports in different phases of the project.
  • Created and modified UNIX Shell Scripts for ETL jobs.
  • Extensive experience with Data Extraction, Transformation, and Loading (ETL) from Multiple Sources.
  • Worked on data integration from various source files such as flat files, CSV files, Relational Databases etc. into a common analytical Data Model.
  • Experience in Informatica production support; resolved hundreds of tickets involving data issues.
  • Designed, developed and tuned complex mappings, Mapplets, tasks and workflows.
  • Experience in Debugging and Performance tuning of targets, sources, mappings and sessions in Informatica.
  • Moderate experience in Informatica Change Data Capture management.
  • Experience in optimizing the Mappings and implementing the complex business rules by creating re-usable transformations, Mapplets and PL/SQL stored procedures.
  • Extensively used the Slowly Changing Dimension (SCD) technique in business applications.
  • Conducted unit tests, integration tests and User Acceptance Tests (UAT).
  • Experience working with MicroStrategy; developed dashboard and scorecard reports.
  • Expertise in OLTP/OLAP analysis, E-R modeling and dimensional modeling.
  • Developed database schemas such as star schema and snowflake schema used in relational, dimensional and multidimensional modeling.
  • Working experience in Agile and Waterfall methodologies.
  • Excellent communication skills, with the ability to communicate effectively with executive and management teams, and strong analytical and problem-solving skills.
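
For illustration, a minimal sketch of the analytical-function style referenced above, in Oracle syntax; the ORDERS table and its columns are hypothetical, not drawn from any specific engagement:

    -- Rank each customer's orders and compare each order with the previous one
    SELECT customer_id,
           order_date,
           amount,
           RANK() OVER (PARTITION BY customer_id ORDER BY amount DESC) AS amount_rank,
           LAG(amount) OVER (PARTITION BY customer_id ORDER BY order_date) AS prev_amount,
           FIRST_VALUE(amount) OVER (PARTITION BY customer_id ORDER BY order_date
                                     ROWS BETWEEN UNBOUNDED PRECEDING
                                              AND UNBOUNDED FOLLOWING) AS first_amount,
           SUM(amount) OVER (PARTITION BY customer_id) AS customer_total  -- reporting aggregate
    FROM orders;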

TECHNICAL SKILLS:

ETL TECHNOLOGY: Informatica PowerCenter 10.0/9.x/8.x, Informatica PowerExchange 10.0/9.x, IDQ 10.0/9.x, Enterprise Information Catalog (EIC), Intelligent Data Lake (IDL)

DATA WAREHOUSE: Star schema, snowflake schema, multidimensional modeling and development

DATA MODELLING: MS Visio, Erwin 8/7.1

DATABASES: Oracle 11g/10g, MS SQL Server 2012/7.0/2000, MS Access, Sybase, DB2, MySQL, Teradata 14/13, Netezza

PROGRAMMING: C, C++, SQL, PL/SQL, HTML, CSS, UNIX shell scripting; GitHub for version control

TOOLS: Quest TOAD, SQL*Plus, SQL*Loader, SQL*Net, SQL Navigator Export/Import, Oracle Discoverer 10g

OPERATING SYSTEMS: Windows 98/NT/2000/XP, AIX, Sun Solaris, UNIX, MS-DOS

APPLICATIONS: MS Office 2007, MS Project, FrontPage, Toad 9.2/8.6, Basecamp, Rally

BIG DATA / AWS: Hadoop, Sqoop, Hive, S3, Snowflake

BI REPORTING: MicroStrategy 10.x/9.x, dashboard development

PROFESSIONAL EXPERIENCE:

Confidential, Dallas

Sr. Informatica Developer/Data Analyst

Responsibilities:

  • Worked with business analysts on requirement gathering, business analysis, testing and project coordination using interviews, document analysis, business process descriptions, scenarios and workflow analysis.
  • Created the Technical Design Document or Minor Release Document (MRD) from the Business Requirements Document (BRD) or Functional Requirement Document (FRD) supplied by the business analyst, based on business objectives, and facilitated joint sessions.
  • Analyzed business and system requirements to identify system impacts.
  • Created flow diagrams and charts.
  • Validated, Standardized and cleaned data as per the business rules using IDQ.
  • Initiated the process of Data Profiling by Profiling different formats of data from different sources and users.
  • Created detailed technical design documents containing the ETL technical specifications for the given functionality: the overall process flow for each process, flow diagrams, mapping spreadsheets, issues, assumptions, configurations, Informatica code details, database changes, shell scripts, etc.; conducted meetings with business analysts and clients for approval of the process.
  • Analyzed the existing mapping logic to determine the reusability of the code.
  • Handled versioning and dependencies in Informatica.
  • Created Mapping Parameters, Session parameters, Mapping Variables and Session Variables.
  • Created mappings to store data in S3 buckets and used AWS command-line commands for operations such as move and download.
  • Used AWS infrastructure to load S3 bucket data into Snowflake, which business users accessed for ad-hoc queries (see the Snowflake sketch after this list).
  • Translated the PL/SQL logic into Informatica mappings including Database packages, stored procedures and views.
  • Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system. This led to better session performance.
  • Created and maintained the Shell Scripts and Parameter files in UNIX for the proper execution of Informatica workflows in different environments.
  • Worked extensively with version control software (GitHub).
  • Actively participated in Data Quality services and frameworks.
  • Created UNIX scripts to read/write and FTP files to and from Windows servers and UNIX.
  • Created Unit test plans and did unit testing using different scenarios separately for every process. Involved in System test, Regression test and supported the UAT for the client.
  • Performed ETL and database code migrations across environments using deployment groups.
  • Implemented business rules in mappings to populate the target tables.
  • Developed Informatica Data Quality Mappings, sessions, workflows, scripts and orchestration Schedules.
  • Involved in end-to-end system testing, performance and regression testing and data validations.
  • Worked extensively on modifying and updating existing Oracle code, including object types, views, PL/SQL stored procedures and packages, functions and triggers, based on business requirements.
  • Worked in agile minor release cycles as the designated database developer.
  • Unit tested database changes and supported QA and UAT testing.
  • Managed performance and tuning of SQL queries and fixed the slow running queries in production.
  • Helped support data masking projects for DRD across all Dev, QA and UAT environments via Enterprise Data Obfuscation (EDO).
  • Created, verified and deployed builds to QA and UAT using Transporter.
  • Created batch scripts for automated database build deployment.
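
A minimal sketch of the S3-to-Snowflake load pattern described above, in Snowflake SQL; the stage, bucket and table names are placeholders, and authentication options are deliberately elided:

    -- External stage pointing at the S3 bucket (credentials/storage integration omitted)
    CREATE OR REPLACE STAGE sales_stage
      URL = 's3://example-bucket/sales/'
      FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1);

    -- Bulk-load the staged files into a table that business users query ad hoc
    COPY INTO sales_raw
    FROM @sales_stage
    ON_ERROR = 'CONTINUE';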

Environment: Informatica PowerCenter 10.0, Informatica Data Quality 10.0, Oracle 11g, PL/SQL Developer, MySQL, RDS MySQL, Teradata 14, AWS S3 bucket, Putty, BitBucket, Rally, JIRA, Snowflake DB, Transporter, ServiceNow (SNOW), Enterprise Data Obfuscation (EDO), Control-M 10

Confidential, Miami, FL

Sr. Informatica Developer/Data Analyst

Responsibilities:

  • Analyzed business requirements, technical specifications, source repositories and physical data models for ETL mapping and process flow.
  • Worked extensively with mappings using expressions, aggregators, filters, lookup, joiners, update strategy and stored procedure transformations.
  • Extensively used Pre-SQL and Post-SQL scripts for loading data into the targets as per requirements (see the Pre-SQL/Post-SQL sketch after this list).
  • Developed mappings to load fact and dimension tables for type 1 and type 2 dimensions with incremental loading, and unit tested the mappings.
  • Extracted data from a web service source, transformed data using a web service, and loaded data into a web service target.
  • Experience with real-time web services that perform a lookup operation using a key column as input and return a response with multiple rows of data belonging to the key.
  • Used the Web Service Provider Writer to send flat file targets as attachments and to send email from within a mapping.
  • Imported Hive tables using PowerExchange; PowerExchange for Hadoop accesses Hadoop to extract data from HDFS or load data to HDFS/Hive (see the Hive sketch after this list).
  • Coordinated and developed all documents related to ETL design and development.
  • Involved in designing the Data Mart models with ERwin using Star schema methodology.
  • Used Repository Manager to create the repository and user groups, and managed users by setting up privileges and profiles.
  • Used the Debugger to debug mappings and correct them.
  • Performed Database tasks such as creating database objects (tables, views, procedures, functions).
  • Responsible for debugging and performance tuning of targets, sources, mappings and sessions.
  • Optimized mappings and implemented complex business rules by creating reusable transformations and Mapplets.
  • Involved in writing BTEQ, MLOAD and TPUMP scripts to load data into Teradata tables (see the BTEQ sketch after this list).
  • Optimized source queries to control temp space and added delay intervals, depending on business requirements, for performance.
  • Used Informatica Workflow Manager to create, run and schedule batches and sessions at specified times.
  • Executed sessions, sequential and concurrent batches for proper execution of mappings and set up email delivery after execution.
  • Prepared functional specifications, system architecture/design, implementation strategy, test plans and test cases.
  • Implemented and documented all the best practices used for the data warehouse.
  • Improved ETL performance through indexing and caching.
  • Created Workflows, tasks, database connections, FTP connections using workflow manager.
  • Responsible for identifying bugs in existing mappings by analyzing data flow, evaluating transformations and fixing bugs.
  • Conducted code walkthroughs with team members.
  • Developed stored procedures using PL/SQL and driving scripts using Unix Shell Scripts.
  • Created UNIX shell scripts for automation of ETL processes.
  • Used UNIX for check-ins and check-outs of workflows and config files into ClearCase.
  • Automated ETL workflows using Control-M Scheduler.
  • Involved in production deployment and later moved into warranty support until transition to production support team.
  • Experience in monitoring and reporting issues for the daily, weekly and monthly processes; also resolved issues on a priority basis and reported them to management.
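
A minimal sketch of the Pre-SQL/Post-SQL pattern mentioned above: SQL a session can run before and after the load (the staging table and index names are hypothetical):

    -- Pre-SQL: empty the staging table and disable its index before the bulk load
    TRUNCATE TABLE stg_orders;
    ALTER INDEX stg_orders_ix UNUSABLE;

    -- Post-SQL: rebuild the index once the session has finished loading
    ALTER INDEX stg_orders_ix REBUILD;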
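
A sketch of the HDFS/Hive integration pattern referenced above: an external Hive table over HDFS files of the kind PowerExchange for Hadoop can read from or write to (the path and columns are illustrative):

    -- External table: Hive reads the HDFS files in place, nothing is copied
    CREATE EXTERNAL TABLE IF NOT EXISTS claims_raw (
      claim_id  STRING,
      member_id STRING,
      amount    DECIMAL(12,2)
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/raw/claims';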
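
A minimal BTEQ sketch of the Teradata load scripts referenced above (logon details and table names are placeholders; MLOAD and TPUMP follow a similar scripted style):

    .LOGON tdprod/etl_user,password;

    -- Load the day's staged rows into the target table
    INSERT INTO edw.orders
    SELECT *
    FROM   stg.orders_daily
    WHERE  load_date = CURRENT_DATE;

    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;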

Environment: Informatica PowerCenter 9.6.1, IDQ 9.6.1, Oracle 11g, Teradata 14.1.0, Web Services, Hadoop, Hive, Teradata SQL Assistant, MS SQL Server 2012, DB2, Erwin 9.2, DAC Scheduler, Putty, Shell Scripting, ClearCase, WinSCP, Notepad++, JIRA, Control-M V8, Cognos 10.x

Confidential, Minneapolis

Sr. Informatica Developer/Data Analyst

Responsibilities:

  • Involved in full life cycle development, including design, ETL strategy, troubleshooting, reporting, and identifying facts and dimensions.
  • Reviewed the requirements with the business, performed regular follow-ups and obtained sign-offs.
  • Worked on different tasks in workflows such as sessions, event raise, event wait, decision, email, command, worklets, assignment, timer and scheduling of the workflow.
  • Created sessions and configured workflows to extract data from various sources, transform the data, and load it into the data warehouse.
  • Moved data from source systems to different schemas, based on the dimension and fact tables, using slowly changing dimensions (SCD) type 2 and type 1 (see the SCD sketch after this list).
  • Used Debugger to test the mappings and fixed the bugs.
  • Used various transformations like Filter, Expression, Sequence Generator, Source Qualifier, Lookup, Router, Rank, Update Strategy, Joiner, Stored Procedure and Union to develop robust mappings in the Informatica Designer.
  • Analyzed sources, requirements and the existing OLTP system, and identified the required dimensions and facts from the database.
  • Tuned Informatica mappings and sessions for optimum performance.
  • Developed various mappings using reusable transformations.
  • Prepared the required application design documents based on functionality required.
  • Designed the ETL processes using Informatica to load data from Oracle, Flat Files (Fixed Width and Delimited) to staging database and from staging to the target Warehouse database.
  • Worked on database connections, SQL joins, cardinalities, loops, aliases, views, aggregate conditions, parsing of objects and hierarchies.
  • Responsible for monitoring all sessions that are running, scheduled, completed or failed; debugged the mapping whenever a session failed.
  • Involved in unit and integration testing of Informatica sessions and batches, and in fixing invalid mappings.
  • Defined the program specifications for the data migration programs, as well as the necessary test plans used to ensure the successful execution of the data loading processes.
  • Developed and executed scripts to schedule loads, for calling Informatica workflows using PMCMD command.
  • Worked on Dimensional Data Modeling using Data modeling tool Erwin.
  • Populated Data Marts and did System Testing of the Application.
  • Built the Informatica workflows to load tables as part of the data load.
  • Wrote queries, procedures and functions that are used as part of different application modules.
  • Implemented the best practices for the creation of mappings, sessions and workflows and performance optimization.
  • Created Informatica technical and mapping specification documents per company standards.
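
A minimal SQL sketch of the SCD type 2 pattern referenced above; inside Informatica this is typically built with Lookup and Update Strategy transformations, and the table and column names here are hypothetical:

    -- Expire the current row of any customer whose tracked attribute changed
    UPDATE dim_customer d
    SET    d.eff_end_date = CURRENT_DATE,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1 FROM stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND    s.address <> d.address);

    -- Insert the new version (and brand-new customers) as the current row
    INSERT INTO dim_customer (customer_id, address, eff_start_date, eff_end_date, current_flag)
    SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1 FROM dim_customer d
                       WHERE  d.customer_id = s.customer_id
                       AND    d.current_flag = 'Y');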

Environment: Oracle 11g/9i, MySQL, Informatica 9.x, Putty, UNIX Shell Scripting, Windows XP, Tableau, MS Visio, MS Office, Control-M

Confidential, Boston, MA

ETL Informatica Developer/ Data Analyst

Responsibilities:

  • Analyzed business requirements with the help of BRD and HLD documents and by attending design review meetings.
  • Analyzed external/internal data sources and various systems; created a data dictionary of the various tables within the Data Mart.
  • Actively participated in interactions with business users, technical architects, DBAs and technical manager to fully understand the requirements of the system.
  • Participated in the development of a data quality framework including scorecard and dashboard.
  • Analyzed and reviewed stored procedures, user-defined functions, views and SQL scripts implementing the complex business logic used in reporting, and cross-verified the results with business users.
  • Extensively used joins and sub-queries for complex queries involving multiple tables from different databases, and optimized the database by creating various clustered and non-clustered indexes and indexed views.
  • Wrote several SQL Scripts such as finding tables that have identity columns, finding tables that do not have primary keys etc.
  • Reviewed drill-through reports and data blending when merging different sources in Tableau.
  • Performance Tuning in SQL Server 2008 using SQL Profiler and Data Loading.
  • Worked with data analysis tools for data lineage, metadata and data profiling.
  • Created indexes on selective columns to speed up queries, analyzed them in the database, and successfully reduced overhead by avoiding unnecessary use of the UNION statement and by using the TOP operator to limit SELECT statements (see the tuning sketch after this list).
  • Performed data cleansing and data masking while creating table views for business users, maintaining privacy guidelines (see the masking-view sketch after this list).
  • Created views restricting access to data in OLTP tables to maintain security guidelines.
  • Responsible for creating and presenting informational reports for management based on SQL data.
  • Participated in tasks to make changes to the SQL database that would prevent data corruption.
  • Raised defects using HP ALM Quality Center for the issues faced and tracked the defects.
  • Attended the daily team meetings on project status and delivery status.
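
A small sketch of the query-tuning patterns mentioned above, in SQL Server syntax (the table and index names are illustrative):

    -- Selective index to speed up frequent lookups on order status
    CREATE NONCLUSTERED INDEX ix_orders_status
        ON dbo.orders (status) INCLUDE (order_date);

    -- Prefer UNION ALL when duplicates are impossible: it avoids the costly de-duplication sort
    SELECT order_id FROM dbo.orders_2017
    UNION ALL
    SELECT order_id FROM dbo.orders_2018;

    -- TOP limits the rows returned instead of materializing the full result
    SELECT TOP (100) order_id, order_date
    FROM   dbo.orders
    ORDER  BY order_date DESC;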
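
A minimal sketch of a masked, restricted view over an OLTP table for business users, again in SQL Server syntax (the columns, masking rule and role name are hypothetical):

    -- Expose only non-sensitive columns; mask the SSN except its last four digits
    CREATE VIEW dbo.v_customer_masked AS
    SELECT customer_id,
           first_name,
           'XXX-XX-' + RIGHT(ssn, 4) AS ssn_masked,
           city
    FROM   dbo.customer;

    -- Business users query the view, never the base table
    GRANT SELECT ON dbo.v_customer_masked TO business_users;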

Environment: Oracle 11g R2, UNIX, SQL Server 2008, Informatica 8.1, MS Office, Cognos, MS Access, MS Visio, MS Excel, XML, Windows XP
