
Data Analyst/ETL (Informatica) Developer Resume


Portland, Oregon

SUMMARY:

  • Confidential professional with around 4.6 years of experience comprising strong technical and problem-solving skills in Business Intelligence, ETL development with Informatica Power Center, and database development.
  • Worked with Confidential versions 15/14/13, Informatica Power Center 10.1/9.5/9.1/8.6, and Informatica Data Quality (IDQ) 9.5/9.1 as ETL tools for extracting, transforming, and loading data from various source data inputs to various targets.
  • Worked extensively with Confidential utilities - FastLoad, MultiLoad, TPump, and Confidential Parallel Transporter (TPT) - to load huge amounts of data from flat files into the Confidential database (see the FastLoad sketch after this list).
  • Used FastExport extensively to export data from Confidential tables.
  • Generated BTEQ scripts to invoke the various load utilities, transform the data, and query against the Confidential database.
  • Created proper primary indexes (PIs), taking into consideration both the planned access paths and even distribution of data across all available AMPs (see the DDL sketch after this list).
  • Extensive experience in integrating data from flat files (fixed width, delimited), XML, and Web Services using various Informatica transformations such as Source Qualifier, XML Parser, and Web Services Consumer.
  • Worked with various user groups and developers to define TASM workloads, developed TASM exceptions, and implemented filters and throttles as needed.
  • Strong expertise in designing and developing Business Intelligence solutions in staging, populating Operational Data Store (ODS), Enterprise Data Warehouse (EDW), Data Marts / Decision Support Systems using Informatica Power Center 9.x/8.x/7.x/6.x ETL tool.
  • Experienced in Repository Configuration/using Transformations, creating Informatica Mappings, Mapplets, Sessions, Worklets, Workflows, Processing tasks using Informatica Designer / Workflow Manager to move data from multiple source systems into targets.
  • Experienced in Installation, Configuration, and Administration of Informatica Power Center 8.x/7.x/6.x.
  • Expertise in creating databases, users, tables, triggers, macros, views, stored procedures, functions, packages, and join and hash indexes in the Confidential database.
  • Industry experience in using query tools such as TOAD, SQL Developer, PL/SQL Developer, Confidential SQL Assistant, and Queryman.
  • Expertise in Master Data Management concepts and methodologies, and the ability to apply this knowledge in building MDM solutions.
  • Scheduled and ran Tivoli Workload Scheduler (TWS v8.4) job streams and jobs requested by application support; created streams and jobs for day and night batch runs.
  • Worked in a Tableau environment to create dashboards, such as yearly and monthly reports, using Tableau Desktop and publish them to the server; converted Excel reports to Tableau dashboards with rich visualization and good flexibility.
  • Proficient in performance analysis, monitoring, and SQL query tuning using EXPLAIN plans, collected statistics, hints, and SQL Trace in both Confidential and Oracle.
  • Experience working with Confidential PDCR utility.
  • Excellent experience with different indexes (PI, SI, JI, AJI, PPI (MLPPI, SLPPI)) and with collecting statistics.
  • Wrote Confidential macros and used various Confidential analytic functions (a macro sketch follows this list).
  • Extensive knowledge of data warehouse approaches - top down (Inmon) and bottom up (Kimball) - and dimensional methodologies such as star schema and snowflake.
  • Good knowledge on Confidential Manager, TDWM, PMON, DBQL.
  • Expertise in transforming data imported from disparate data sources into analysis data structures, using SAS functions, options, ODS, array processing, macro facility, and storing and managing data in SAS data files.
  • Extensively used various Informatica Power Center and Data Quality transformations - source qualifier, aggregator, update strategy, expression, joiner, lookup, router, sorter, filter, web services consumer, XML parser, address validator, comparison, consolidation, decision, parser, standardizer, match, and merge - to perform various data loading and cleansing activities.
  • Extensive knowledge on scheduling tools - Control-M, Autosys, Tivoli (TWS), ESP and CRON.
  • Extensively used Control-M Enterprise Manager to schedule jobs, perform initial data loads, and copy data from one environment to another when an environment is first set up.
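
A minimal sketch of the load-utility scripting described above. This FastLoad example is hypothetical: the TDPID, credentials, object names, delimiter, and file path are illustrative placeholders, not taken from any actual project.

    /* Hypothetical FastLoad script: pipe-delimited flat file into a staging table */
    SESSIONS 4;
    ERRLIMIT 25;
    LOGON tdpid/etl_user,password;
    SET RECORD VARTEXT "|";
    DEFINE cust_id    (VARCHAR(18)),
           cust_name  (VARCHAR(60)),
           cust_state (VARCHAR(2))
    FILE = /data/inbound/customer.txt;
    BEGIN LOADING stage_db.stg_customer
          ERRORFILES stage_db.stg_customer_err1, stage_db.stg_customer_err2;
    INSERT INTO stage_db.stg_customer
    VALUES (:cust_id, :cust_name, :cust_state);
    END LOADING;
    LOGOFF;

FastLoad requires an empty target table, so in practice a MultiLoad or TPT script of similar shape handles incremental loads.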
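The primary-index and statistics decisions above usually reduce to DDL of this shape. A hedged sketch with hypothetical table and column names; the monthly RANGE_N partition (an SLPPI) is one illustrative choice:

    CREATE MULTISET TABLE dw.orders
    ( order_id INTEGER NOT NULL,
      cust_id  INTEGER,
      order_dt DATE FORMAT 'YYYY-MM-DD',
      amount   DECIMAL(12,2)
    )
    PRIMARY INDEX (order_id)   /* NUPI: even row distribution across AMPs, common join path */
    PARTITION BY RANGE_N(order_dt BETWEEN DATE '2012-01-01'
                         AND DATE '2015-12-31'
                         EACH INTERVAL '1' MONTH);

    COLLECT STATISTICS ON dw.orders COLUMN (order_id);
    COLLECT STATISTICS ON dw.orders COLUMN (order_dt);

    /* Verify partition elimination and index usage before promoting the design */
    EXPLAIN SELECT SUM(amount)
    FROM   dw.orders
    WHERE  order_dt BETWEEN DATE '2014-06-01' AND DATE '2014-06-30';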
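The macros mentioned above are typically small parameterized wrappers over frequently run queries. A hypothetical example:

    /* Hypothetical macro: daily sales rollup, parameterized by run date */
    REPLACE MACRO dw.daily_sales (run_dt DATE) AS (
      SELECT order_dt, SUM(amount) AS total_amount
      FROM   dw.orders
      WHERE  order_dt = :run_dt
      GROUP  BY order_dt;
    );

    EXEC dw.daily_sales (DATE '2014-06-01');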

EXPERIENCE:

Confidential, Portland, Oregon

Data Analyst/ETL (Informatica) Developer

Responsibilities:

  • Develop and communicate a deep understanding of product metrics.
  • Identify data integrity issues in the generation of any metric.
  • Work with product stakeholders (product analytics, PMs, engineers) to identify the root cause of data integrity issues and suggest changes that generate product insights.
  • Perform analytical deep-dives to identify problems, opportunities and specific actions required.
  • Leverage data-driven insights in product brainstorming, road-mapping, and trade-off discussions.
  • Work together with product management, engineering, design, policy, and senior executives to rapidly execute, learn and iterate.
  • Work hands-on with large quantities of data using SQL, Confidential, and other data and statistical tools, as well as reporting tools such as Tableau.
  • Extensively used Confidential utilities such as BTEQ, FastLoad, and MultiLoad, along with DDL and DML commands (SQL).
  • Wrote multithreading and synchronization scripts.
  • Wrote database connection handling code using JDBC; interacted continuously with the business team.
  • Worked with Informatica power center Designer, Workflow Manager, Workflow Monitor and Repository Manager.
  • Developed and maintained ETL (Extract, Transform, and Load) mappings to extract data from multiple source systems such as Oracle, SQL Server, and flat files, and load it into Oracle.
  • Developed Informatica Workflows and sessions associated with the mappings using Workflow Manager.
  • Involved in creating new table structures and modifying existing tables to fit the existing data model.
  • Extracted data from different databases like Oracle and external source systems like flat files using ETL tool.
  • Implemented the COP dashboard in Java.
  • Developed TPT scripts to load data from Load Ready Files to Confidential Warehouse.
  • Used BTEQ and SQL Assistant (Queryman) front-end tools to issue SQL commands against the Confidential RDBMS per the business requirements.
  • Used SQL to query the databases, doing as much crunching as possible in Confidential and applying involved SQL query optimization (EXPLAIN plans, collected statistics, data distribution across AMPs, primary and secondary indexes, locking, etc.) to achieve better performance.
  • Monitored database space, identified tables with high skew (see the skew-check sketch after this list), and worked with the data modeling team to change the primary index on such tables.
  • Tuned Confidential SQL statements using EXPLAIN, analyzing data distribution among AMPs and index usage, collecting statistics, defining indexes, revising correlated subqueries, using hash functions, etc.
  • Loaded flat files into the database using FastLoad and then used them in queries to perform joins.
  • Used Confidential SQL with BTEQ scripts to get the data needed.
  • Created proper PIs, taking into consideration both planned access and even distribution of data across all available AMPs.
  • Used Confidential utilities (FastLoad, MultiLoad) to load data into the target data warehouse and used Confidential SQL Workbench to query data in it.
  • Responsible for developing a data pipeline using HDInsight, Flume, Sqoop, and Pig to extract data from weblogs and store it in HDFS.
  • Involved in migrating ETL processes from Oracle to Hive to test easier data manipulation (see the Hive sketch after this list).
  • Managed log files, backups, and capacity.
  • Found and troubleshot Hadoop errors.
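
The skew monitoring above can be expressed as a query against the DBC space views, comparing each table's largest per-AMP footprint to its average. A hedged sketch; the database name and the 1.5 threshold are assumptions:

    /* Per-table skew: max per-AMP space vs. average per-AMP space */
    SELECT DatabaseName,
           TableName,
           SUM(CurrentPerm) AS total_perm,
           MAX(CurrentPerm) AS max_amp_perm,
           MAX(CurrentPerm) * COUNT(*) / NULLIFZERO(SUM(CurrentPerm)) AS skew_ratio
    FROM   DBC.TableSizeV
    WHERE  DatabaseName = 'dw'   /* hypothetical database */
    GROUP  BY 1, 2
    HAVING MAX(CurrentPerm) * COUNT(*) / NULLIFZERO(SUM(CurrentPerm)) > 1.5
    ORDER  BY skew_ratio DESC;

Tables with a ratio well above 1.0 are candidates for a new primary index, since one AMP is holding disproportionately many rows.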
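For the Oracle-to-Hive migration testing noted above, a typical first step is an external table over the landed files plus the old aggregation re-expressed in HiveQL. A hypothetical sketch; the table layout, delimiter, and HDFS path are illustrative:

    -- External table over weblog files already landed in HDFS
    CREATE EXTERNAL TABLE IF NOT EXISTS weblogs_raw (
      log_ts  STRING,
      user_id STRING,
      url     STRING,
      status  INT
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/weblogs/raw';

    -- The same rollup previously run in Oracle, re-expressed in Hive
    SELECT url, COUNT(*) AS hits
    FROM   weblogs_raw
    WHERE  status = 200
    GROUP  BY url
    ORDER  BY hits DESC;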

Confidential

ETL (Informatica) Developer

Responsibilities:

  • Successfully executed various projects, including:
  • PRPD (Partial Redistribution, Partial Duplication): developed test cases for functional and performance testing and coded modules across different levels of the Confidential database in 2012.
  • Confidential Columnar (column partitioning): developed test cases for functional and performance testing and coded modules across different levels of the Confidential database in 2011.
  • Designed and implemented appropriate ETL mappings to extract and transform data from various sources to meet requirements.
  • Designed and developed Informatica ETL mappings to extract master and transactional data from heterogeneous data feeds and load it into the targets.
  • Installed and Configured the Informatica Client tools.
  • Worked on loading data from several flat files into XML targets.
  • Designed the procedures for getting the data from all systems to Data Warehousing system.
  • Created the environment for the staging area and loaded the staging area with data from multiple sources.
  • Analyzed business process workflows and assisted in the development of ETL procedures for moving data from source to target systems.
  • Used Workflow Manager for session management, database connection management, and job scheduling.
  • As a member of dbsfrontline, supported various teams in identifying and resolving database issues.
  • Provided support to the Customer Support Team and handled escalated issues.
  • Recognized with Night on the Town (NOTT) twice for outstanding contributions to the Confidential Columnar and PRPD projects in 2012.
  • Enhanced the test validation procedure by developing stored procedures using XML plans, and developed Perl scripts to automate the quantum plan comparison testing procedure (see the query-logging sketch after this list).
  • Developed a plan comparison tool in C to compare query plans and print each plan in tree format.
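
One hedged sketch of how XML query plans can be captured for plan comparison work like this, using Confidential query logging (the user name is hypothetical, and the exact logging options vary by release):

    /* Capture query plans as XML via DBQL for later comparison */
    BEGIN QUERY LOGGING WITH XMLPLAN LIMIT SQLTEXT=0 ON qa_user;

    /* ... run the test queries as qa_user ... */

    /* Plans accumulate keyed by QueryID; export and diff them between releases */
    SELECT QueryID, XMLTextInfo
    FROM   DBC.DBQLXMLTbl;

    END QUERY LOGGING ON qa_user;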
