
Data Analyst Resume


San Francisco, CA

SUMMARY

  • Over nine years of experience in the IT industry, with expertise in analysis, design, development, implementation, testing, and support of Data Warehouse and MDM applications across the Telecom, Sales, Retail, and Banking domains.
  • Extensive experience with Informatica PowerCenter 9.x/8.x/7.x (Designer, Repository Manager, Workflow Manager, Workflow Monitor), Informatica Data Quality (IDQ), and the Informatica Analyst tool.
  • Used complex PowerCenter transformations such as Lookup, Joiner, Expression, Router, Update Strategy, Source Qualifier, Aggregator, Filter, Sequence Generator, and Normalizer to accomplish mapping designs.
  • Excellent knowledge of IDQ transformations such as Parser, Labeler, Match, Consolidation, Address Validator, and Bad Record Exception.
  • Strong skills in Informatica MDM (Master Data Management) for golden records and the mastering process.
  • Well versed in relational and dimensional modeling techniques, including Star and Snowflake schemas, OLAP, and dimension and fact tables.
  • Experience in performance tuning of targets, sources, mappings, and sessions in PowerCenter.
  • Extensive database experience with Oracle 11g/10g/9i, Teradata, MS SQL Server 2012/2008/2005, and Greenplum.
  • Developed complex database objects such as stored procedures, analytic functions, and triggers.
  • Experience with Teradata as the target for data marts; worked with Teradata utilities such as BTEQ, FastLoad, and MultiLoad.
  • Exposure to reporting tools such as Salesforce Wave Analytics, OBIEE, SSRS, and Tableau.
  • Good knowledge of building analytical components using Scala, Spark, and Spark Streaming.
  • Highly effective at maintaining team dynamics and leading teams to successful project delivery.
  • Proven ability to work efficiently in both independent and collaborative environments, with excellent interpersonal and communication skills.
  • Motivated team player, proficient in process-centric environments such as Agile/Scrum.
  • Experience in developing applications using Java 1.5.
  • Sound knowledge of major Hadoop ecosystem technologies such as Hive, Pig, HBase, and Sqoop.
  • Excellent knowledge of data analysis using HiveQL and R.
  • Provided Informatica training at Confidential Greenfield Training (GFT).

TECHNICAL SKILLS

Data Warehousing Tools: Informatica MDM 10.0.1, Informatica PowerCenter 9.x/8.x/7.x, Informatica IDQ 9.6.1, Informatica PowerExchange 9.6.5 and 8.6, Oracle Data Integrator (ODI), SSIS

Reporting Tools: Salesforce Wave Analytics, Business Objects R4, SSRS, Tableau

Modeling Tools: Rational Software Architect, Erwin, Visio

Databases: Oracle 11g/10g/9i, MS SQL Server 2012/2008/2005, Teradata, MS Access

Programming: Java 1.5, SQL, PL/SQL, UNIX Shell Scripting, Python, Scala

Big Data Technologies: Hive, Pig, HBase, Sqoop, Spark

Scheduling Tools: Autosys, DAC, Control M, Tidal

Versioning Tools: SharePoint, Rational Team Concert, Perforce, Team Foundation Server, GitHub

OS: Win 2000, Win XP, UNIX, Linux, Windows 7, Windows 8

Other Tools: Toad, SQL Developer, SoapUI, Google Geocoding API

Application server: JBoss

IDE: Eclipse

PROFESSIONAL EXPERIENCE

Confidential, San Francisco, CA

Data Analyst

Responsibilities:

  • Created the data model for a data reconciliation framework spanning different source systems.
  • Created the ETL design, developed mappings and workflows, and performed unit testing and peer reviews.
  • Created Base Objects, staging tables, and relationships in MDM.
  • Used Address Doctor and the Google API to cleanse party addresses.
  • Involved in query and package development.
  • Developed MDM mappings and cleanse functions.
  • Worked closely with the ETL data architect to populate landing tables in MDM.
  • Conducted analysis sessions with business users to understand matching requirements.
  • Set up match/merge and ran match rules to check the effectiveness of the MDM process on the data.
  • Created views and enhanced web services for downstream data consumers.
  • Used SIF APIs (GET, SearchMatch, PUT, CleansePut, ExecuteBatchDelete, etc.) from SoapUI to test search, update, cleanse, insert, and delete operations; a code-level sketch of such a call follows this list.
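
The following is a minimal, illustrative Scala sketch of invoking a SIF operation over SOAP from code rather than from SoapUI. The endpoint URL and the envelope body are hypothetical placeholders; the real SearchMatch/Put payloads are defined by the hub's SIF WSDL.

    // Minimal sketch: posting a SOAP request to an MDM SIF endpoint.
    // The URL and envelope below are placeholders, not the actual SIF schema.
    import java.net.{HttpURLConnection, URL}
    import java.nio.charset.StandardCharsets

    object SifClientSketch {
      def post(endpoint: String, soapAction: String, envelope: String): String = {
        val conn = new URL(endpoint).openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("POST")
        conn.setDoOutput(true)
        conn.setRequestProperty("Content-Type", "text/xml; charset=UTF-8")
        conn.setRequestProperty("SOAPAction", soapAction)
        conn.getOutputStream.write(envelope.getBytes(StandardCharsets.UTF_8))
        val response = scala.io.Source.fromInputStream(conn.getInputStream).mkString
        conn.disconnect()
        response
      }

      def main(args: Array[String]): Unit = {
        val endpoint = "http://mdm-host:8080/cmx/request/HUB_ORS" // hypothetical hub URL
        val envelope =
          """<soapenv:Envelope xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/">
            |  <soapenv:Body><!-- SearchMatch request built from the SIF WSDL --></soapenv:Body>
            |</soapenv:Envelope>""".stripMargin
        println(post(endpoint, "SearchMatch", envelope))
      }
    }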

Environment: Informatica MDM 10.1, IDD, Address Doctor, Oracle 11g, Control M, Informatica IDQ 9.6.1, Informatica Analyst, Google API, Tableau

Confidential, San Francisco, CA

Data Analyst

Responsibilities:

  • Worked closely with the clients to understand the requirements.
  • Created the data model and ETL design according to the requirements and developed Informatica jobs.
  • Developed data visualization reports and dashboards using Tableau.
  • Performed data profiling and data quality checks in IDQ.
  • Created scorecards in IDQ and helped business analysts define survivorship rules in MDM.
  • Provided work estimates to clients and managed task allocation to the offshore team.
  • Created reference tables and developed a framework to publish this data to business analysts for updates using Human Tasks in IDQ.
  • Developed an IDQ mapplet using the Address Validator and invoked it through PowerCenter.
  • Used the Bad Record Exception transformation in IDQ to publish bad data for business analysts' review in the Informatica Analyst tool.
  • Developed mappings and workflows, and performed unit testing and peer reviews.
  • Handled code migration to higher environments.
  • Implemented a file watcher in shell script and created separate shell scripts for post-processing files; a sketch of the watcher pattern follows this list.
  • Provided production support and fixed production failures within the SLA.
  • Created an audit mechanism for different vendor files.
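
The file watcher mentioned above was written as a shell script; as an illustration of the pattern, here is a minimal sketch in Scala using java.nio's WatchService. The landing directory is a hypothetical placeholder, and the println stands in for the post-processing scripts.

    // Illustrative file-watcher sketch (the original was a shell script).
    // Blocks until files land in a directory, then hands each one off.
    import java.nio.file.{FileSystems, Paths, StandardWatchEventKinds => K}
    import scala.collection.JavaConverters._

    object FileWatcherSketch {
      def main(args: Array[String]): Unit = {
        val landingDir = Paths.get("/data/vendor/incoming") // hypothetical path
        val watcher = FileSystems.getDefault.newWatchService()
        landingDir.register(watcher, K.ENTRY_CREATE)

        while (true) {
          val key = watcher.take() // blocks until at least one event arrives
          for (event <- key.pollEvents().asScala if event.kind() == K.ENTRY_CREATE) {
            val file = landingDir.resolve(event.context().toString)
            println(s"New file detected: $file") // post-processing hook goes here
          }
          key.reset() // re-arm the key for further events
        }
      }
    }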

Environment: Informatica PowerCenter 9.6.1, Flat Files, SQL Server 2012, Oracle 11g, Linux, Control M, Informatica IDQ 9.6.1, Informatica Analyst, Informatica MDM 10.1, IDD, Address Doctor, Google API, Tableau

Confidential, San Francisco, CA

Sr. Informatica Developer

Responsibilities:

  • Worked closely with business analysts to finalize the BRD (Business Requirements Document).
  • Created technical design documents and mapping documents for ETL.
  • Created Informatica mappings, sessions, workflows, and Wave upload jobs.
  • Redesigned existing mappings to improve performance.
  • Created UNIX scripts and modified existing scripts for reusability.
  • Validated JSON and XMD files for datasets created by the ETL process.
  • Scheduled ETL jobs and shell scripts in the Tidal framework.
  • Conducted pre-UAT sessions for business users.
  • Analyzed issues found in UAT and fixed bugs.
  • Implemented a row-level security predicate for Sales KOA datasets.
  • Performed data validation in the Salesforce Wave Analytics app (Analytics Cloud).
  • Developed Scala scripts and UDFs using both DataFrames/SQL and RDD-style transformations in Spark 1.6 for data aggregation and queries, writing data back into the OLTP system through Sqoop; see the sketch after this list.
  • Developed Spark scripts using Scala shell commands as required.
  • Loaded data into Spark RDDs and performed in-memory computation to generate the output.
  • Performed advanced procedures such as text analytics and processing, using Spark's in-memory computing capabilities with Scala.
  • Handled large datasets during ingestion using partitioning, Spark's in-memory capabilities, broadcast variables, and efficient joins and transformations.
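
Below is a minimal Scala sketch of the kind of Spark 1.6 aggregation job described above. The paths, columns, and the normalizeRegion UDF are hypothetical stand-ins; in the actual pipeline the aggregated output was pushed back to the OLTP system via Sqoop.

    // Minimal Spark 1.6-style aggregation sketch (hypothetical schema).
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.SQLContext
    import org.apache.spark.sql.functions._

    object SalesAggSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("SalesAggSketch"))
        val sqlContext = new SQLContext(sc)

        // Hypothetical input: one row per order line, stored as Parquet.
        val orders = sqlContext.read.parquet("/data/sales/orders")

        // A simple UDF standing in for the custom business-rule UDFs.
        val normalizeRegion = udf((r: String) => Option(r).map(_.trim.toUpperCase).orNull)

        val daily = orders
          .withColumn("region", normalizeRegion(col("region")))
          .groupBy(col("region"), col("order_date"))
          .agg(sum("amount").as("total_amount"), count("order_id").as("order_count"))

        // Results staged to HDFS; a Sqoop export would load them into OLTP.
        daily.write.parquet("/data/sales/daily_agg")
        sc.stop()
      }
    }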

Environment: Informatica 9.5.1, Oracle 11g, Confidential, PowerExchange for Salesforce, Tidal, Tidal Transporter, UNIX, Salesforce Wave Analytics, Flat Files, Perforce, JSON editor.

Confidential, San Francisco, CA

Sr. Informatica Developer

Responsibilities:

  • Analyzed old ETL jobs in BusinessObjects Data Integrator and created the design approach.
  • Analyzed the impact on the existing data warehouse to keep it in sync with the legacy system.
  • Created the design document and source-to-target mapping document for the legacy migration.
  • Designed and developed complex Informatica mappings, including Type 2 slowly changing dimensions; see the sketch after this list.
  • Provided design walkthroughs to architects and finalized the design.
  • Extracted data from the Salesforce cloud and migrated it to the new data warehouse.
  • Developed mappings, sessions, and workflows, and checked deployment packages into Perforce.
  • Created Tidal jobs in the development and test environments.
  • Performed performance tuning on mappings.
  • Performed peer reviews and tested peers' deliverables.
  • Created release notes for each deployment.
  • Supported system integration and provided bug fixes in production.
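
As an illustration of the Type 2 slowly changing dimension pattern implemented in those mappings, here is a minimal plain-Scala sketch (the production logic lived in Informatica mappings, not code). The customer/address fields and dates are hypothetical.

    // Type 2 SCD sketch: a changed attribute closes the current row and
    // opens a new version; unchanged and historical rows pass through.
    // (Brand-new customers are omitted for brevity.)
    case class DimRow(customerId: String, address: String,
                      effectiveFrom: String, effectiveTo: String, isCurrent: Boolean)

    object Scd2Sketch {
      def applyScd2(dimension: List[DimRow],
                    incoming: Map[String, String], // customerId -> latest address
                    loadDate: String): List[DimRow] =
        dimension.flatMap { row =>
          incoming.get(row.customerId) match {
            case Some(newAddr) if row.isCurrent && newAddr != row.address =>
              // Expire the old version and open the new one.
              List(row.copy(effectiveTo = loadDate, isCurrent = false),
                   DimRow(row.customerId, newAddr, loadDate, "9999-12-31", isCurrent = true))
            case _ => List(row)
          }
        }
    }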

Environment: Informatica 9.5.1, BusinessObjects Data Integrator, Oracle 11g, Tidal, Tidal Transporter, UNIX, XML files, Flat Files, Perforce, PowerExchange for Salesforce.

Confidential, Long Island, NY

Sr. Informatica Developer

Responsibilities:

  • Worked closely with business analysts to ensure the data quality of incoming files.
  • Worked with the Informatica Data Quality toolkit, covering the analysis, data cleansing, data matching, and data conversion capabilities of IDQ.
  • Identified and eliminated duplicates in datasets using the Edit Distance, Jaro Distance, and Mixed Field Matcher components of IDQ; see the sketch after this list.
  • Used Address Doctor to validate and correct addresses.
  • Created the ETL design according to file arrival patterns and wrote the TSD.
  • Performed extensive extraction of data from flat files to Greenplum.
  • Developed mappings and workflows and performed unit testing.
  • Designed and developed pre-session, post-session, and batch execution routines.
  • Debugged and troubleshot mappings.
  • Provided Informatica code and job walkthroughs to the QA team and fixed defects.
  • Coordinated offshore development activities.
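
IDQ's matching components are configured rather than hand-coded; as an illustration of the edit-distance idea behind them, here is a minimal Scala sketch. The name field and the 0.85 similarity threshold are hypothetical.

    // Levenshtein edit distance plus a similarity threshold: the core idea
    // behind edit-distance duplicate matching.
    object EditDistanceSketch {
      def levenshtein(a: String, b: String): Int = {
        // Classic dynamic-programming table; row/column 0 hold base cases.
        val dp = Array.tabulate(a.length + 1, b.length + 1) { (i, j) =>
          if (i == 0) j else if (j == 0) i else 0
        }
        for (i <- 1 to a.length; j <- 1 to b.length) {
          val cost = if (a(i - 1) == b(j - 1)) 0 else 1
          dp(i)(j) = math.min(math.min(dp(i - 1)(j) + 1, dp(i)(j - 1) + 1),
                              dp(i - 1)(j - 1) + cost)
        }
        dp(a.length)(b.length)
      }

      // Two values "match" when normalized similarity clears the threshold.
      def isDuplicate(n1: String, n2: String, threshold: Double = 0.85): Boolean = {
        val maxLen = math.max(n1.length, n2.length)
        maxLen == 0 || 1.0 - levenshtein(n1, n2).toDouble / maxLen >= threshold
      }

      def main(args: Array[String]): Unit =
        println(isDuplicate("Jonathan Smith", "Jonathon Smith")) // true
    }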

Environment: Informatica 9.5.1, Flat Files, Greenplum, UNIX, PostgreSQL, AutoSys, Perforce, Informatica IDQ

Confidential, Bentonville, AR

Lead Informatica Developer

Responsibilities:

  • Converted the logical data model to a physical data model in RSA and generated scripts for the DBAs.
  • Created the mapping document (TSD) from the high-level design.
  • Managed the offshore development team and provided status updates to senior management.
  • Developed mappings and workflows to pull data from Informix.
  • Deployed ETL and DB changes to higher environments.
  • Performed unit testing and component integration testing before handing the code over to QA.
  • Supported the QA and UAT phases.
  • Designed and developed UNIX shell scripts.
  • Debugged and fixed bugs in Informatica mappings.
  • Reviewed mappings and workflows developed by peers and reported defects.
  • Created documents to hand the project over to the support team.

Environment: Informatica 9.1, Oracle 11g, Informix, OBIEE, UNIX, RSA, Teradata SQL Assistant.

Confidential

Solution Designer/ Development Lead

Responsibilities:

  • Worked with business users and business analysts to gather functional requirements.
  • Created the data model, solution design documents, and technical design documents.
  • Designed the universe and reports for the BusinessObjects reporting environment.
  • Supported the offshore development team in understanding the design.
  • Extensively used PL/SQL procedures and packages to implement business rules.
  • Created and configured workflows, worklets, and sessions to move data into the target warehouse tables using Workflow Manager.
  • Optimized mapping performance through various tests on sources, targets, and transformations; developed PL/SQL procedures and functions for ETL.
  • Extensively used ETL to load data from source systems such as flat files into staging tables and then into the target Oracle database.
  • Created and used reusable mapplets and transformations.
  • Improved performance by identifying bottlenecks at the source, target, mapping, and session levels.
  • Designed and developed pre-session, post-session, and batch execution routines.
  • Involved in debugging and troubleshooting mappings.

Environment: Informatica PowerCenter 9.1.1, Oracle 10g, UNIX, PL/SQL Developer, HP-UX, BSCS, Siebel CRM, EAI, IN-Eserve, Business Objects, RSA

Confidential

ETL developer

Responsibilities:

  • Created low-level mapping documents from the high-level design.
  • Created mappings, sessions, and workflows.
  • Worked with the Topology Manager, contexts, and physical schemas.
  • Deployed workflows to higher environments using the Informatica import/export utility.
  • Performed mapping reviews using the Informatica code review tool.
  • Developed pre- and post-SQL scripts, as well as PL/SQL stored procedures and functions.
  • Used the Debugger to check for errors in mappings.
  • Wrote UNIX shell scripts to automate daily load processes.
  • Created test cases and executed test scripts as part of unit testing.
  • Involved in data quality assurance and process automation.
  • Documented the entire process, including the mapping document, unit testing document, and system testing document.
  • Effectively handled change requests by developing, implementing, and testing solutions.
  • Reviewed mappings and workflows developed by peers and reported defects.
  • Managed change control implementation and coordinated daily and monthly releases.

Environment: Informatica PowerCenter 7.1.4, Oracle 10g, DAC, ODI, UNIX, PL/SQL, OBIEE, PowerExchange

Confidential

ETL developer

Responsibilities:

  • Involved in business analysis and requirements gathering.
  • Designed the mappings from sources to operational staging targets.
  • Used Informatica PowerCenter as the ETL tool for building the data warehouse.
  • Employed Aggregator, Sequence Generator, and Joiner transformations to populate the data.
  • Used Teradata as a source and a target for a few mappings; worked with Teradata loaders within Workflow Manager to configure FastLoad and MultiLoad sessions.
  • Transformed data from SQL Server databases and loaded it into an Oracle database.
  • Developed PL/SQL and UNIX shell scripts for scheduling sessions in Informatica.
  • Created test cases and executed test scripts as part of unit testing.
  • Provided knowledge transfer to end users and created extensive documentation on the design, development, implementation, daily loads, and process flow of the mappings.

Environment: Informatica Power Center 7.0, SQL Server 2008, Oracle 10g, Teradata, Control M
