Informatica Developer / Data Analyst Resume

NC

SUMMARY

  • 6+ years of experience in Requirement Analysis, Functional/Technical Analysis, Design, Development, Implementation, Testing, Debugging, Production Support, and Maintenance of various data warehousing applications.
  • Overall 6+ years of IT experience, including data integration and analysis for decision support systems in insurance and financial environments.
  • Worked closely with architects on requirement gathering and with project teams on the solution.
  • Strong analytical skills in evaluating existing and planned systems to determine their feasibility.
  • Data Warehousing experience using Informatica Power Center 9.6/9.5/9.1/8.6, Repository Admin Console, Repository Manager, Designer, Workflow Manager, Workflow Monitor, ETL, data marts, OLAP, and OLTP.
  • Experience with database systems such as Oracle, MS SQL Server, Teradata, and DB2.
  • Excellent experience in extracting, transforming, and loading data from various source databases to target databases.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 methodology for preserving the full history of accounts and transaction information (a minimal SCD Type 2 sketch follows this summary).
  • Involved in the full life cycle of building a data warehouse.
  • Experienced in mapping performance optimization in Informatica.
  • Experienced in the Analysis, Design, Development, Testing, and Implementation of business application systems for Pharmaceutical, Financial, Insurance, Telecommunication and Manufacturing Sectors.
  • Used various sources such as SAP R/3, Teradata, Oracle, SQL Server, and flat files.
  • Extensive experience in error handling and problem fixing in Informatica.
  • Excellent skills in retrieving data by writing simple and complex SQL queries. Experienced in writing test plans and test cases and in unit, system, integration, and functional testing.
  • Used scheduling tools such as Control-M and Autosys to automate the ETL process.
  • Exposure to Cognos, Tableau, and Business Objects reporting tools.
  • Possess good interpersonal, presentation, and development skills with a strong analytical and problem-solving approach.
  • Excellent team player, self-motivated, with good communication skills.
  • Excellent skills in analyzing and gathering business requirements from end users and clients, with experience in documenting developed applications.
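
The Slowly Changing Dimension Type 2 approach mentioned in the summary follows a standard expire-and-insert pattern. The sketch below shows that pattern in Python with an in-memory account dimension; the column names, the single tracked attribute (status), and the sample data are illustrative assumptions, since the actual implementation was built in Informatica mappings.

```python
from datetime import date

# A tiny in-memory stand-in for an account dimension maintained as SCD Type 2:
# every change inserts a new row and closes the previous one, so full history is kept.
# Column names and data are hypothetical.
dim_account = [
    {"sk": 1, "account_id": "A100", "status": "ACTIVE",
     "eff_date": date(2015, 1, 1), "end_date": None, "current": True},
]

def apply_scd2(dim, incoming, load_date):
    """Expire the current row for each changed natural key and add a new version."""
    next_sk = max(r["sk"] for r in dim) + 1
    for rec in incoming:
        current = next((r for r in dim
                        if r["account_id"] == rec["account_id"] and r["current"]), None)
        if current and current["status"] == rec["status"]:
            continue                              # unchanged: keep the current row
        if current:                               # changed: close the old version
            current["end_date"] = load_date
            current["current"] = False
        dim.append({"sk": next_sk, "account_id": rec["account_id"],
                    "status": rec["status"], "eff_date": load_date,
                    "end_date": None, "current": True})
        next_sk += 1

apply_scd2(dim_account, [{"account_id": "A100", "status": "CLOSED"}], date(2016, 6, 30))
for row in dim_account:
    print(row)    # two rows for A100: the expired version and the new current one
```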

TECHNICAL SKILLS

ETL Tools: Informatica Power Center 9.x/8.x, DataStage, Ab Initio

Databases/Tools: Teradata, Netezza, Oracle 10g/11g/12c, MS SQL Server, MySQL, Sybase, MS Access, DB2 UDB, Toad

BI Tools: Business Objects, Cognos, Tableau, OBIEE, Hyperion

Data Modeling: Erwin, Visio, PowerDesigner, Embarcadero

Operating Systems: UNIX, Linux, Windows

Programming Languages: C, C++, Java, XML, HTML, SQL, Shell Scripting, Python, Perl

PROFESSIONAL EXPERIENCE

Confidential, NC

Informatica Developer / Data Analyst

Responsibilities:

  • Participated in Joint Application Design/Development (JAD) sessions and in discussions with clients/users to determine the dimensions, hierarchies and levels, basic measures, derived measures, and key metrics for the relational/multidimensional data model, to understand reporting requirements, and to design the star schema data model. Extensively involved in requirement analysis.
  • Experienced in using ERwin for data modeling, creating logical and physical database designs and ER diagrams with all related entities and the relationships between them, based on the rules provided by the business manager. Worked extensively in forward and reverse engineering processes.
  • Participated in Requirement Gathering, Business Analysis, User meetings, discussing the issues to be resolved and translating user inputs into ETL design documents.
  • Worked on the Data Warehouse team analyzing data files that had to be compiled from disparate non-production sources and readied for production. Tasks included comparing data to requirements documentation and creating data layouts, the data dictionary, and the pipe-delimited and fixed-width files. In addition, served as team lead for managing the completion and loading of these files for the Data Management group.
  • Participated as member of a project team performing various lifecycle tasks for an intranet application with Oracle database for tracking public offering orders. Developed logical and physical database models and data dictionary documentation. Worked as liaison between users and developers refining and explaining user requirements.
  • Assessed data quality requirements in terms of data completeness, consistency, conformity, accuracy, referential integrity, and duplication; evaluated vendors (Informatica Data Explorer/Data Quality); identified and measured data quality; and designed and implemented a data profiling and data quality improvement solution to analyze, match, cleanse, and consolidate data before loading into the data warehouse (a small profiling sketch follows this list).
  • Developed data quality reports/dashboards with Cognos Framework Manager and Report Studio.
  • Designed the relational data model for the operational data store and staging areas, and designed dimension and fact tables for data marts.
  • Extensively involved in various data modeling tasks including forward engineering, reverse engineering, complete compare, creating DDL scripts, creating subject areas, building the data dictionary, publishing models to PDF and HTML format, generating various data modeling reports, etc.
  • Designed a Conceptual Data Model, Logical Data Model, and Physical Data Model using Erwin.
  • Worked as a Data Warehouse and Database Developer. Core responsibility was to interview customers and subject matter experts to gather the requirements for a proposed database, analyze these requirements and the source systems, and build the data warehouse using Data Warehouse Builder.
  • Served as data architect in a centralized group producing logical and physical data models in ERwin. Also wrote database interface specifications and documented them in the Data Manager data dictionary.
  • Designed Informatica ETL mapping documents and created the ETL staging area framework, data mapping documents, data flow diagrams, ETL test scripts, etc.
  • Extensively used Erwin and Normalization Techniques to design Logical/Physical Data Models, relational database design.
  • Experience with utilities Informatica Data Quality (IDQ) and Informatica Data Explorer (IDE).
  • Conducted data modeling sessions for different user groups, facilitated common data models between different applications, participated in requirement sessions to identify logical entities.
  • Extensively involved in creating, maintaining, and updating documents such as requirement documents, database design documents, system design documents, naming standard procedures, SOPs, etc.
  • Analyzed and optimized the existing business processes using conceptual models and data flow diagrams. The process included assessing performance enhancement requirements and mapping the existing workflows and data flows.
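
The data quality assessment described above (completeness, duplication, referential integrity) can be illustrated with a small profiling routine. This Python sketch is only a stand-in for the Informatica Data Explorer/Data Quality profiling used on the project; the policy/customer records and column names are hypothetical.

```python
from collections import Counter

# Hypothetical staged records to be profiled before loading into the warehouse.
policies = [
    {"policy_id": "P1", "customer_id": "C1", "premium": 1200.0},
    {"policy_id": "P2", "customer_id": "C9", "premium": None},   # missing premium, orphan customer
    {"policy_id": "P2", "customer_id": "C2", "premium": 800.0},  # duplicate policy_id
]
customers = [{"customer_id": "C1"}, {"customer_id": "C2"}]

def profile(rows, key, required, fk, parent_keys):
    """Report completeness, duplicate keys, and referential-integrity failures."""
    report = {}
    # Completeness: share of rows with every required column populated.
    complete = sum(all(r.get(c) is not None for c in required) for r in rows)
    report["completeness"] = complete / len(rows)
    # Duplication: natural keys that occur more than once.
    counts = Counter(r[key] for r in rows)
    report["duplicate_keys"] = [k for k, n in counts.items() if n > 1]
    # Referential integrity: child rows whose foreign key has no parent.
    report["orphans"] = [r[key] for r in rows if r[fk] not in parent_keys]
    return report

parents = {c["customer_id"] for c in customers}
print(profile(policies, key="policy_id", required=["policy_id", "premium"],
              fk="customer_id", parent_keys=parents))
# completeness ~0.67, duplicate_keys ['P2'], orphans ['P2']
```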

Environment: Informatica Power Center 9.6, Oracle 11g/12c, Teradata, Cognos, Erwin, PL/SQL, UNIX Shell Scripts, Perl, Python

Confidential, Minneapolis MN

Data Analyst

Responsibilities:

  • Designed data and source-to-target mapping work on the MIDE data warehouse.
  • Worked on mapping and modeling of the data.
  • Checked and validated data in the mapping document by running queries on Teradata.
  • Documented and maintained an RTM (Requirement Traceability Matrix) to ensure that all requirements were traceable from top to bottom and thus provide a basis for the test plan.
  • Modified the existing mapping template into a clearer, easier-to-understand template.
  • Developed ETL mappings, testing, correction and enhancement and resolved data integrity issues.
  • Identified the source and target tables and developed the mapping document using MS Excel.
  • Determined the natural keys and surrogate keys for the target tables (see the surrogate key sketch after this list).
  • Worked on solution integration relative to the needs of the business.
  • Performed data alignment and data cleansing.
  • Provided the required transformation logic using Teradata queries.
  • Performed ad hoc Data analysis through UNIX command line.
  • Organized and conducted cross-functional meetings to ensure linearity of the phase approach.
  • Collaborated with a team of Business Analysts to ascertain capture of all requirements.
  • Developed communications plan and message content in MS Office environment.
  • Managed and exceeded counterparts' and leadership's expectations.
  • Analyzed the data statistically and prepared statistical reports using the SAS tool.
  • Retrieved data from different databases and manipulated that data as needed.
  • Worked closely with the data architect to resolve the referential integrity problems.
  • Delivered the artifacts within the stipulated timelines and excelled in the quality of deliverables.
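
Surrogate key determination, mentioned above, typically means assigning a warehouse-generated key to each distinct natural key from the source. The Python sketch below illustrates that lookup-or-allocate idea; on the project this was done in the ETL tool and database sequences, and the column names here are hypothetical.

```python
import itertools

# Map each natural key (e.g. a source customer number) to a stable surrogate key.
surrogate_keys = {}                      # natural key -> surrogate key
key_sequence = itertools.count(start=1)  # stand-in for a database sequence

def surrogate_for(natural_key):
    """Return the existing surrogate key or allocate the next one."""
    if natural_key not in surrogate_keys:
        surrogate_keys[natural_key] = next(key_sequence)
    return surrogate_keys[natural_key]

# Hypothetical source rows: the same customer appears twice and keeps one key.
source_rows = [{"cust_no": "C-1001"}, {"cust_no": "C-2002"}, {"cust_no": "C-1001"}]
target_rows = [{"cust_sk": surrogate_for(r["cust_no"]), **r} for r in source_rows]
print(target_rows)   # C-1001 -> sk 1 (both times), C-2002 -> sk 2
```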

Environment: PowerDesigner, Teradata, Informatica 9.5, Oracle 11g, Mainframe, Cognos, Excel Macros, UNIX, SAS

Confidential, Alpharetta, GA

Informatica developer/Data Analyst

Responsibilities:

  • Liaised with business subject matter experts in analyzing business requirements and understanding the various data sources.
  • Designed the productivity data mart. Reviewed and maintained the schema, its tables, indexes, views, and PL/SQL procedures in Oracle 11g.
  • Built the ETL processes and led the ETL design by establishing and implementing best practices.
  • Developed strategies for use in a high-volume, high-performance heterogeneous environment.
  • Mapped source data elements from various systems such as Avaya, CMS (Customer Management System), outbound dialers, SBR, and IVR to CCPM, and developed, tested, and supported the extraction, transformation, and load processes.
  • Analyzed source data related to Work Force, Call Stats/Productivity, Contact, Contact Quality, Training, Hierarchy Final, and Coaching. Applied business rules to the data in the raw data mart and loaded it into the reporting data repository.
  • Defined and captured metadata and rules associated with ETL processes in Informatica.
  • Designed and developed the technical/functional specifications for ETL development and implemented them using Informatica. Analyzed workflow dependencies for various ETL processes, handled exceptions, and maintained logs.
  • Managed artifact versions, including software code, documents, design models, and even the directory structure itself.
  • Wrote shell scripts for file transfers and file renaming, as well as several other database scripts to be executed from UNIX (a simplified file-archiving sketch follows this list).
  • Created documents for the data flow and ETL process using Informatica mappings to support the project once it was implemented in production.
  • Created and documented ETL test plans, test cases, test scripts, expected results, assumptions, and validations (see the reconciliation sketch after this list).
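
The shell scripts for file transfers and renaming mentioned above generally stamp and move extract files between directories. The sketch below shows the same idea in Python rather than shell; the directory paths, the *.dat pattern, and the timestamp naming convention are assumptions for illustration only.

```python
import shutil
from datetime import datetime
from pathlib import Path

# Rename incoming extract files with a load timestamp and move them to an archive
# directory. The paths and file pattern below are illustrative assumptions.
INCOMING = Path("/data/incoming")
ARCHIVE = Path("/data/archive")

def archive_extracts():
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    for src in INCOMING.glob("*.dat"):
        dest = ARCHIVE / f"{src.stem}_{stamp}{src.suffix}"
        shutil.move(str(src), str(dest))   # e.g. orders.dat -> orders_20240101_023000.dat
        print(f"archived {src.name} as {dest.name}")

if __name__ == "__main__":
    archive_extracts()
```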
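
One of the simplest validations covered by the ETL test cases above is source-to-target reconciliation. The Python sketch below compares a row count and a summed amount; the two result dictionaries are hypothetical stand-ins for values that a real test would fetch from the source system and the warehouse.

```python
# Hypothetical source and target query results (row count, total amount) that a
# test case would fetch from the source system and the warehouse respectively.
source_result = {"row_count": 1_250, "total_amount": 98_750.25}
target_result = {"row_count": 1_250, "total_amount": 98_750.25}

def reconcile(source, target, tolerance=0.01):
    """Pass if counts match exactly and totals agree within a small tolerance."""
    return {
        "row_count": source["row_count"] == target["row_count"],
        "total_amount": abs(source["total_amount"] - target["total_amount"]) <= tolerance,
    }

results = reconcile(source_result, target_result)
print("PASS" if all(results.values()) else f"FAIL: {results}")
```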

Environment: Informatica Power Center 9.1, PowerDesigner, Teradata, Oracle 10g/11g, SQL Server 2008, Toad, DB2, Flat Files, Windows NT 4.0, UNIX, Business Objects 6.5/XI.

Confidential, New Brunswick, NJ 

Data warehouse Analyst

Responsibilities:

  • Worked on Informatica components - Source Analyzer, Warehouse Designer, Mapping Designer, Mapplets, and Transformations.
  • Involved in designing the procedures for getting the data from all systems to Data Warehousing system. The data was standardized to store various Decision making Units in tables.
  • Performed data modeling with ER diagrams and process modeling with data flow diagrams during the design process.
  • Created UNIX Environment variables in various .ksh files.
  • Analyzed the technical, organizational, and economic feasibility of the project.
  • Created Informatica mappings with PL/SQL procedures/functions to build decision rules to load data.
  • Extensively used most of the transformations, such as Source Qualifier, Aggregator, Lookup, Filter, and Sequence Generator (a small sketch of the equivalent row-level logic follows this list).
  • Extensively Used ETL to load data from different databases and flat files to Oracle.
  • Involved in the data migration from COBOL files on the mainframe/AS400 to DB2.
  • Extensively worked on Database Triggers, Stored Procedures, Functions and Database Constraints.
  • Created the Universes using Business Objects Designer.
  • Created Hierarchies to provide the drill down functionality to the end-user.
  • Created various classes, objects, Filters, Conditions in the Universes.
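
The Filter, Lookup, and Aggregator transformations used above each correspond to simple row-level operations. The Python sketch below shows equivalent logic over a few hypothetical order rows; the real work was done inside Informatica mappings, and the sample columns and lookup table are illustrative only.

```python
from collections import defaultdict

# Hypothetical source-qualifier output: raw order rows from the source system.
orders = [
    {"order_id": 1, "region_code": "NE", "amount": 250.0},
    {"order_id": 2, "region_code": "SW", "amount": 0.0},
    {"order_id": 3, "region_code": "NE", "amount": 120.0},
]
# Lookup table: region code -> region name (what a Lookup transformation resolves).
region_lookup = {"NE": "Northeast", "SW": "Southwest"}

# Filter transformation: drop rows that fail the filter condition.
valid_orders = [o for o in orders if o["amount"] > 0]

# Lookup transformation: enrich each row with the region name.
enriched = [{**o, "region": region_lookup.get(o["region_code"], "UNKNOWN")}
            for o in valid_orders]

# Aggregator transformation: group by region and total the amounts.
totals = defaultdict(float)
for row in enriched:
    totals[row["region"]] += row["amount"]

print(dict(totals))   # {'Northeast': 370.0}
```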

Environment: Informatica Power Center 9.x, SQL Server, Oracle 11g, Embarcadero, UNIX

Education
