
Information Technology Analysis Resume


PROFESSIONAL SUMMARY:

  • Over 8 years of experience in Information Technology in analysis, design and development of software applications. Expertise in development of data warehousing solutions using Informatica PowerCenter 10.1/9.6/9.5.1 and PowerMart 7.1.1/6.2, and PowerConnect.
  • Expertise in implementation of enterprise data warehouses using Informatica PowerCenter 10.1/9.5 and PowerMart 6.2.1/5.x, and DecisionStream.
  • Expertise in working with Oracle 12c/11g, SQL Server 2014/2008, DB2, Sybase and Teradata.
  • Experience in the field of data warehousing using ETL tools such as Informatica PowerCenter 9.6/8.x/7.x and PowerMart 9.x/8.x, and databases such as DB2, Oracle, MS SQL Server and Teradata.
  • Worked with Teradata utilities like BTEQ, FastExport, FastLoad and MultiLoad to export and load data to/from different source systems.
  • Created SSIS packages using SSIS Designer to export heterogeneous data from OLE DB sources (Oracle) and Excel spreadsheets to SQL Server 2008/2014.
  • Extensively worked on Informatica IDE/IDQ.
  • Designed and developed automated processes and workflows to integrate information from external sources to drive the creation and update of facilities in the ACBS system.
  • Applied the rules and profiled the source and target tables' data using IDQ.
  • Proficient in warehouse designs based on the Ralph Kimball and Bill Inmon methodologies.
  • Knowledge of full life cycle development in data warehousing.
  • Sound knowledge of database architecture for OLTP and OLAP applications, data analysis and ETL processes.
  • Expertise in implementing complex business rules using different transformations, mappings and mapplets.
  • Worked on SQL, PL/SQL, Oracle packages, procedures, functions, triggers and Oracle database tuning.
  • Used Informatica PowerCenter to extract, transform and load data into the Netezza data warehouse from various sources like Oracle and flat files.
  • Hands-on experience using query tools like TOAD, SQL Developer, PL/SQL Developer and Teradata SQL.
  • Extensively worked with stored procedures, triggers, cursors, indexes, functions and SSIS packages, as well as data cleansing and data profiling using Informatica tools.
  • Experience in UNIX shell scripting, cron, FTP and file management in various UNIX environments, as well as JavaScript environments.
  • Knowledge of different schemas (star and snowflake) to fit reporting, query and business analysis requirements.
  • Completed projects based on Informatica Cloud, using Informatica Cloud Designer for design techniques and cloud data integration where large volumes of data are handled, and on the enterprise data integration suite consisting of PowerCenter and PowerExchange, with PowerCenter version 9.5.1.

TECHNICAL SKILLS:

Data Warehouse Tools: Informatica PowerCenter 9.6/9.5/8.x/7.x, DataStage, Informatica PowerAnalyzer/PowerMart 9.x, Informatica Data Quality 9.5 (IDQ)

Operating Systems: Windows XP/Vista/08/07, UNIX, Linux, IBM Mainframes, PuTTY, WinSCP

Databases: Oracle 11g/10g/9i, MySQL, SQL Server 08/05, Teradata, MS Access

Reporting Tools: SAP BO, SSRS, SSAS, ACBS

Dimensional Data Modeling: Star Schema Modeling, Snowflake Modeling, Erwin

Testing: UAT, Functional, ETL

Database Tools: Oracle SQL Developer, SQL*Plus, SQL*Loader, TOAD, Netezza 6.0/7.0, MS Office, Microsoft Visio, OLTP/OLAP

Languages: SQL, PL/SQL, XML, UNIX Shell Scripting, COBOL

PROFESSIONAL EXPERIENCE:

Information Technology Analysis

Confidential

Responsibilities:

  • Developed mappings to extract data from SQL Server, Oracle, Teradata, SFDC, Siebel and flat files to load into Teradata using PowerCenter.
  • Used Informatica PowerCenter 10.1/9.x/8.6.1/8.5 and all its features extensively in migrating data from OLTP to the enterprise data warehouse.
  • Proficient in HL7 standards for interoperability of health information technology.
  • Involved in analysis and development of the data warehouse.
  • Wrote shell scripts to monitor the health of Hadoop daemon services and respond accordingly to any warning or failure conditions.
  • Involved in massive data profiling using IDQ (Analyst tool) prior to data staging.
  • Extensively used Erwin for logical and physical data modeling and designed star schemas.
  • Extracted data from different sources like Oracle, flat files, XML, DB2 and SQL Server, and loaded it into the DWH.
  • Performed tuning and optimization of complex SQL queries using Teradata EXPLAIN.
  • Expert-level knowledge of complex SQL using Teradata functions, macros and stored procedures.
  • Experience in upgrading from Informatica version 9.6 to 10.1.
  • Supervised the installation of DataStage software and established all setups, including database connections, plug-ins, configuration files and directories.
  • Worked on multiple projects using the Informatica Developer tool (IDQ), latest version 9.5.
  • Performed a feasibility study of the HL7-centric data model for the MVPS staging database.
  • Created and modified existing SSIS packages moving data from Oracle databases to SQL databases.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.
  • Designed SSIS packages to transfer data from flat files to SQL Server using Business Intelligence Development Studio.
  • Designed, documented and configured the Informatica MDM Hub to support loading, matching, merging and publication of MDM data.
  • Experience in extraction, transformation and loading of data from different heterogeneous sources. Developed mappings, reusable objects and mapplets using Mapping Designer, Mapplet Designer and Transformation Developer in Informatica PowerCenter 10.1/9.6/8.6.1/8.5.
  • Used Informatica PowerCenter 8.6.1/8.5 for extraction, loading and transformation (ETL) of data in the data warehouse.
  • Planned, documented, developed and led all Hadoop ETL development for the internal company data warehouse.
  • Designed and developed DataStage jobs using DataStage Designer to import and export data from heterogeneous data sources: Oracle, SQL Server and flat files.
  • Worked on PowerExchange to create data maps, pull the data from the mainframe and transfer it into the staging area. Wrote numerous BTEQ scripts to run complex queries on the Teradata database.
  • Developed complex transformations and mapplets using Informatica to extract, transform and load (ETL) data into data marts, the enterprise data warehouse (EDW) and the operational data store (ODS).
  • Handled slowly changing dimensions (Type I, Type II and Type III) based on the business requirements.
  • Parsed documents selectively, that is, retrieving selected data and ignoring the rest, using HL7.
  • Used Informatica PowerCenter Workflow Manager to create sessions and batches to run with the logic embedded in the mappings.
  • Loaded data from different sources into Oracle database tables using PL/SQL routines, UTL_FILE and external tables.
  • Used ETL (SSIS) to develop jobs for extracting, cleaning, transforming and loading data into the data warehouse. Wrote PL/SQL stored procedures, triggers and cursors for implementing business rules and transformations.
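
Several bullets above describe parsing HL7 messages selectively by their delimiters. As a minimal illustrative sketch only (the sample message and field positions below are hypothetical, not from any real project), HL7 v2 delimiter parsing looks like this in Python:

```python
# Minimal sketch of delimiter-based HL7 v2 parsing: segments are separated
# by carriage returns, fields by '|', and components by '^'.
# The sample message is illustrative only.

def parse_hl7(message: str) -> dict:
    """Parse an HL7 v2 message into {segment_id: [field lists]}."""
    segments = {}
    for line in message.strip().split("\r"):
        fields = line.split("|")
        segments.setdefault(fields[0], []).append(fields)
    return segments

sample = ("MSH|^~\\&|SENDAPP|SENDFAC|RECVAPP|RECVFAC|202301011200||ADT^A01|123|P|2.3\r"
          "PID|1||555-44-3333||DOE^JOHN")

msg = parse_hl7(sample)
# Selectively retrieve the patient-name components from PID field 5,
# ignoring the rest of the message
patient_name = msg["PID"][0][5].split("^")
print(patient_name)  # ['DOE', 'JOHN']
```

Selective retrieval then amounts to indexing only the segments and fields of interest and discarding everything else.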

Environment: Informatica PowerCenter 10.x/9.x/8.6.1/8.5 (PowerCenter Repository Manager, Designer, Workflow Manager and Workflow Monitor), Teradata 13, Teradata Manager, SQL Server 2012, DB2 8.1, XML, Autosys, Oracle 12c/10g, TOAD, SQL, PL/SQL, UNIX, Active

Teradata Manager

Confidential

Responsibilities:

  • Developed mappings, reusable objects, transformations and mapplets using Mapping Designer, Transformation Developer and Mapplet Designer in Informatica PowerCenter 9.6/9.5/9.1.
  • Involved in design and development of a new data warehouse (analytical data warehouse) for better reporting and analysis.
  • Involved in data modeling of the Oracle database and debugging stored procedures.
  • Worked on developing Informatica mappings, mapplets, sessions, worklets and workflows for data loads.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, XML Source Qualifier and Stored Procedure transformations.
  • Used delimiters to define the source document structure for HL7 standards.
  • Developed mappings which load the data into Teradata tables with SAP definitions as sources.
  • Analyzed complex ETL requirements/tasks and provided estimates/ETCs.
  • Stored information in table definitions in the repository, entered using the DataStage import/export options.
  • Extensively worked on IDQ admin tasks, serving as both IDQ administrator and IDQ developer.
  • Created SSIS reusable packages to extract data from multi-formatted flat files, Excel and XML files into the UL database and DB2 billing systems.
  • Worked closely with the client on planning and brainstorming to migrate the current RDBMS to Hadoop.
  • Primary activities included data analysis, identifying and implementing data quality rules in IDQ, and finally linking rules to the PowerCenter ETL process and delivery to other data consumers.
  • Designed the ETL process to source the data and load it into ODS tables.
  • Worked on UNIX and Oracle SQL Developer to develop queries and create Oracle procedures and packages.
  • Worked with data warehouse staff to incorporate best practices from Informatica.
  • Developed complex mappings and SCD Type II mappings in Informatica to load the data from various sources into ODS tables. Hands-on experience with data steward tools like Data Merge and Hierarchy Manager in MDM.
  • Used Informatica to extract data from DB2, HL7, XML and flat files to load the data into Teradata.
  • Created and modified Informatica mappings and workflows to load flat files to an Oracle database.
  • Loaded data to the warehouse by extracting data from sources like Oracle and delimited flat files.
  • Worked with the ETL migration team and migrated Informatica folders from the Dev to the Test repository.
  • Profiled the data using the Informatica Analyst tool to analyze source data (departments, party and address) coming from legacy systems, and performed a data quality audit.
  • Performed unit testing and integration testing, and coordinated with QA on UAT for code changes and enhancements.
  • Developed mappings in Informatica to load the data from different sources into the data warehouse, using transformations like Source Qualifier, Java, Expression, Lookup, Aggregator, Update Strategy and Joiner.
  • Worked on SSIS packages and DTS import/export for transferring data from databases (Oracle and text-format data) to SQL Server.
  • Worked with various IDQ transformations like Standardizer, Match, Association, Parser, Weighted Average, Comparison, Consolidation, Decision and Expression.
  • Built data warehouses/data marts using Ascential DataStage 7.5.2/7.5.3 and IBM InfoSphere DataStage 8.1.0/8.1.2/8.5.
  • Worked with different Teradata source systems to analyze data and load it into the data warehouse.
  • Performed reconciliation techniques to validate large health care counts.
  • Experienced in running Hadoop streaming jobs to process terabytes of XML-format data.
  • Studied and analyzed source data systems in terms of business usage and data quality. Active involvement in data model design, MDM configuration proposal and data profiling.
  • Worked on Informatica profiling for large volumes of health care data.
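
The profiling work described above (null checks, distinct counts, value-pattern analysis on legacy source columns) can be sketched in plain Python; this is a simplified stand-in for what an IDQ/Analyst-tool profile reports, and the sample column data is hypothetical:

```python
from collections import Counter

def profile_column(values):
    """Basic data-quality profile for one source column:
    row count, null count, distinct count, and value-pattern frequencies."""
    nulls = sum(1 for v in values if v is None or str(v).strip() == "")
    non_null = [str(v) for v in values if v is not None and str(v).strip() != ""]

    def pattern(s):
        # Map each character to a pattern symbol: digit -> 9, letter -> X
        return "".join("9" if c.isdigit() else "X" if c.isalpha() else c
                       for c in s)

    return {
        "rows": len(values),
        "nulls": nulls,
        "distinct": len(set(non_null)),
        "patterns": Counter(pattern(v) for v in non_null),
    }

# Hypothetical 'party' column from a legacy source
report = profile_column(["A123", "B456", None, "A123", "12-34"])
print(report["nulls"], report["distinct"], dict(report["patterns"]))
```

A real profile would also cover min/max lengths, domain checks and cross-column rules, which this sketch omits.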

Environment: Informatica PowerCenter 9.5/9.1, CDMA, ABC, Facets, TOAD, Tidal, Teradata, FastLoad, FastExport, BTEQ, UNIX, Informatica servers on UNIX, PuTTY, TOAD, MS Visio, MS Office Suite, TDQ, Oracle, MVS, Oracle 11g, SQL Server, Windows NT/2000.

Confidential

ETL Developer

Responsibilities:

  • Designed and developed ETL routines using Informatica PowerCenter 9.5.1. Within the Informatica mappings, extensively used Lookups (connected and unconnected), Aggregator, Rank, mapplets, source filters in Source Qualifiers, and data flow management into multiple targets using Router. Also designed and reported from external systems like SAP and DB2 using Informatica PowerConnect.
  • Migrated data from different sources (text-based files, Excel spreadsheets and Access) to SQL Server databases using SQL Server Integration Services (SSIS).
  • Involved in migration of DataStage projects and jobs from earlier versions to IBM InfoSphere 8.0.1.
  • Prepared technical specifications for the development of Informatica extraction, transformation and loading (ETL) mappings to load data into various tables in data marts, and defined ETL standards.
  • Developed mappings using Informatica to load the data from sources such as relational tables, flat files and Oracle tables into the target data warehouse.
  • Defined and built best practices for creating business rules within the Informatica MDM solution.
  • Experience with Vertica and Teradata environments and their tool sets.
  • Designed and developed pre-session and post-session tasks for the workflows.
  • Implemented slowly changing dimension methodology and developed mappings to keep track of historical data.
  • Used the ACBS functional application.
  • Extensively used Informatica PowerCenter Workflow Manager to create sessions, workflows and batches to run with the logic embedded in the mappings.
  • Expertise in migration of objects from Oracle databases to Teradata.
  • Extensively worked on mapping variables, mapping parameters, workflow variables and session parameters.
  • Created SSIS reusable packages to extract data from multi-formatted files, Excel and XML files into the EDW database.
  • Used Workflow Manager to create workflows, worklets and tasks.
  • Developed session parameter files for the workflows.
  • Used the Informatica command-line utility pmcmd to start and schedule workflows via cron jobs, and wrote UNIX shell scripts to create pre- and post-session commands.
  • Migrated code from the Development repository to Test and then to the Production repository.
  • Responsible for creating new users and folders for individual developers.
  • Created BTEQ (Basic Teradata Query) scripts to generate keys.
  • Extracted data from various flat files, loaded it into the data warehouse environment and wrote UNIX shell scripts to move the files across servers.
  • Created export scripts using the Teradata FastExport utility.
  • Worked on building ETL data flows that work natively on Hadoop, and developed multiple MapReduce jobs in Java for data cleaning and preprocessing.
  • Utilized the Informatica data quality management suite (IDQ and IDE) to identify and merge customers and addresses.
  • Developed complex DataStage jobs according to the business requirements/mapping documents.
  • Interacted daily with the end-to-end (E2E) team to define project requirements based on an understanding of business needs, and helped them in analysis sessions.
  • Used pipeline partitioning to improve session performance.
  • Upgraded Informatica PowerCenter from v9.1.0 to v9.5.1.
  • Created, configured and scheduled the sessions and batches for different mappings using Workflow Manager and UNIX scripts.
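
The session parameter files mentioned above follow the PowerCenter convention of a bracketed [Folder.WF:workflow.ST:session] header followed by name=value pairs. A small Python sketch of generating such a file (the folder, workflow, session and parameter names are hypothetical):

```python
from datetime import date

def build_param_file(folder, workflow, session, params):
    """Render an Informatica-style session parameter file as text.
    The section-header format [Folder.WF:workflow.ST:session] follows the
    PowerCenter convention; all names used here are illustrative."""
    lines = [f"[{folder}.WF:{workflow}.ST:{session}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"

# Hypothetical daily-load parameters
text = build_param_file(
    "EDW", "wf_daily_load", "s_m_load_sales",
    {"$$RUN_DATE": date(2023, 1, 1).isoformat(),
     "$DBConnection_SRC": "ORA_SRC"},
)
print(text)
```

A wrapper script would typically regenerate this file and then start the workflow with pmcmd from cron, which is how the scheduling bullet above was operationalized.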

Environment: Informatica PowerCenter 9.1, Oracle 10g/9i, SQL, PL/SQL, SQL*Plus, Sybase, XML files, Sun Solaris 5.8, Windows 7/8, TOAD 7.0, PL/SQL Developer, Crystal Reports XI, MS Excel, Import Wizard, Central Management Console, InfoView, True Task Scheduler

SQL Server Management Studio

Confidential

Responsibilities:

  • Interpreted the logical and physical data model for business users to determine common data definitions and establish referential integrity of the system.
  • Developed strategies for incremental data extraction as well as data migration, to load into Teradata.
  • Used a star schema to design the warehouse.
  • Worked with business analysts and the DBA on requirements gathering, business analysis, design and documentation of the data warehouse.
  • Developed BTEQ scripts to load data to Teradata and developed UNIX scripts to access Teradata.
  • Worked extensively on Source Analyzer, Mapping Designer and Warehouse Designer.
  • Developed several mappings and mapplets using corresponding sources, targets and transformations.
  • Extensively used almost all transformations, such as Filter, Aggregator, Expression, Router, Lookup, Update Strategy, Sequence Generator and Rank.
  • Created DataStage jobs on demand for moving data from Sybase ASE (OLTP) to Greenplum and Sybase IQ for testing purposes.
  • Configured and used secure FTP from the Informatica server to access source and target files.
  • Scheduled, ran and monitored sessions using Informatica Server Manager.
  • Developed workflows with various tasks and parameter files.
  • Expertise in using different tasks (Session, Assignment, Command, Decision, Email, Event-Raise, Event-Wait, Control).
  • Used ETL for efficient mapping and transformation techniques, and processed the data from various sources to Teradata target tables.
  • Used DataStage Designer to develop parallel jobs to extract, cleanse, transform, integrate and load data into the data warehouse.
  • Defined a target load order plan and constraint-based loading to load data appropriately into multiple target tables.
  • Server administration (startup/shutdown, configuring the services) and system maintenance (log analysis, backups and archival).
  • Migrated Informatica components from Development to the QA and Production environments.
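
One common incremental-extraction strategy of the kind described above is to compare a hash of each row's non-key attributes against the previous load to split the extract into inserts and updates. A minimal Python sketch under that assumption (the key layout and sample rows are hypothetical):

```python
import hashlib

def row_hash(row):
    """Stable hash of the non-key attributes, used to detect changed rows."""
    return hashlib.md5("|".join(str(v) for v in row[1:]).encode()).hexdigest()

def delta(previous, current):
    """Split the current extract into inserts and updates relative to the
    previous load, keyed on the first column. A production incremental
    strategy would also handle deletes and persist the hash table."""
    prev = {r[0]: row_hash(r) for r in previous}
    inserts = [r for r in current if r[0] not in prev]
    updates = [r for r in current
               if r[0] in prev and row_hash(r) != prev[r[0]]]
    return inserts, updates

old = [(1, "Alice", "NY"), (2, "Bob", "TX")]
new = [(1, "Alice", "CA"), (2, "Bob", "TX"), (3, "Carol", "WA")]
ins, upd = delta(old, new)
print(ins, upd)  # [(3, 'Carol', 'WA')] [(1, 'Alice', 'CA')]
```

In the warehouse itself the same comparison is usually pushed into SQL (e.g. a minus/hash join against the staged prior image) rather than done row by row in application code.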

Environment: Informatica 8.x/7.x/5.2/6.2, Oracle 8i/9i, UNIX, Windows NT 4.0, UNIX Shell Programming, PL/SQL, Business Objects 4.7, TOAD (Quest Software), Business Objects XI R3.1/R2.0, Business View Manager, MS Excel, Live Office.

Confidential

PL/SQL, Business

Responsibilities:

  • Worked on design, development and testing of workflows and worklets according to the business process flow.
  • Created mappings using various transformations like Update Strategy, Lookup, Stored Procedure, Router, Joiner, Sequence Generator and Expression.
  • Extensively involved in writing ETL specifications for development and conversion projects.
  • Created technical design specifications for data extraction, transformation and loading (ETL).
  • Extensively worked on change data capture/incremental loading of SCD Type I/II.
  • Designed the ETL process and scheduled the stage and mart loads for the data mart.
  • Worked with the Informatica utilities Source Analyzer, Warehouse Designer, Mapping Designer, Mapplet Designer and Transformation Developer to define sources and targets, and coded the process from the source system to the data warehouse.
  • Developed Teradata views against the departmental database and the claims engine database to get the required data.
  • Extensively used transformations like Stored Procedure, connected and unconnected Lookups, Update Strategy, Filter and Joiner to implement business logic.
  • Executed and scheduled workflows using the Informatica Cloud tool to load data from source to target.
  • Involved in data extraction from Oracle and flat files using SQL*Loader; designed and developed mappings using Informatica PowerCenter.
  • Extensively created reusable transformations and mapplets to standardize business logic.
  • Extracted data from various flat files, loaded it into the data warehouse environment and wrote UNIX shell scripts to move the files across servers.
  • Used Informatica Workflow Manager to create workflows, database connections, sessions and batches to run the mappings.
  • ETL included the selection criteria to extract data from source systems (cloud), performing any necessary data transformations or derivations, data quality audits, and cleansing. Experience using the Informatica Web Services connector was critical.
  • Used variables and parameters in the mappings to pass values between mappings and sessions.
  • Worked on building ETL data flows that work natively on Hadoop.
  • Used UNIX shell scripts to automate pre-session and post-session processes.
  • Created shortcuts for reusable source/target definitions, reusable transformations and mapplets in a shared folder.
  • Imported the IDQ address-standardization mappings into Informatica Designer as mapplets.
  • Coordinated with the testing team to help them understand the business and transformation rules used throughout the ETL process.
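
The SCD Type II loading mentioned above (expire the current dimension row, insert a new version, keep history) can be sketched in a few lines of Python; the record layout with effective/end dates and a current flag is illustrative, not a specific project's schema:

```python
from datetime import date

def apply_scd2(dimension, key, new_attrs, today):
    """Slowly Changing Dimension Type II: expire the current version of a
    row and insert a new version, preserving history. The dict-based
    record layout here is a hypothetical stand-in for a dimension table."""
    for row in dimension:
        if row["key"] == key and row["current"]:
            if row["attrs"] == new_attrs:
                return dimension  # no change: nothing to do
            row["current"] = False
            row["end_date"] = today  # close out the old version
    dimension.append({"key": key, "attrs": new_attrs, "eff_date": today,
                      "end_date": None, "current": True})
    return dimension

dim = [{"key": 10, "attrs": {"city": "NY"}, "eff_date": date(2020, 1, 1),
        "end_date": None, "current": True}]
apply_scd2(dim, 10, {"city": "CA"}, date(2023, 1, 1))
print(len(dim), dim[0]["current"], dim[1]["current"])  # 2 False True
```

In PowerCenter the same logic is expressed with a Lookup against the target, an Expression to compare attributes, and an Update Strategy routing rows to expire-vs-insert paths.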

Environment: Informatica PowerCenter 9.6, Informatica IDQ 9.1, Oracle 11g, SQL Server 2008, IBM DB2, MS Access, Windows XP, TOAD

SQL Developer

Confidential

Responsibilities:

  • Coordinated with the application development team working in Java to provide them with the necessary stored procedures and packages, and insight into the data.
  • Wrote PL/SQL procedures, packages and triggers. Involved in creation of databases; moved databases by creating control files, export/import and complete backups.
  • Created and maintained Oracle schema objects like tables, indexes, sequences and synonyms.
  • Identified performance issues in existing sources, targets and mappings by analyzing the data flow and evaluating transformations, and tuned accordingly for better performance.
  • Designed logical and physical data models, defined relationships and implemented business rules as constraints for normalization.
  • Wrote SQL*Loader scripts to migrate the data from text files, spreadsheets, etc., and populated the intermediate tables.
  • Involved in data migration and data transfer.
  • Involved in writing complex SQL queries using analytical functions in Oracle OLAP.
  • Utilized Quest tools (TOAD 8.0, SQL Navigator) for database monitoring and tuning.
  • Involved in ETL processes to load data from flat files, XML, SQL Server, Access and Excel spreadsheets into the target Oracle database, applying business logic in transformation mappings to insert and update records on load.
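
The SQL*Loader migrations described above are driven by a control file. A minimal hedged example of the control-file shape for loading a comma-delimited text file into an intermediate staging table (the file, table and column names are hypothetical):

```sql
-- Minimal SQL*Loader control file: load a comma-delimited text file
-- into an intermediate staging table. All names are illustrative.
LOAD DATA
INFILE 'customers.txt'
APPEND
INTO TABLE stg_customers
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  load_date  "SYSDATE"
)
```

It would be invoked with the sqlldr utility (e.g. `sqlldr userid=... control=customers.ctl`), after which the intermediate table is validated before the final insert into the target schema.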

Environment: SQL Server, TOAD 8.0, Packages, Triggers, Indexes, XML, Oracle, OLAP, Normalization, UNIX, Windows 7/8.1, UNIX Shell Programming.
