
Associate Resume


Reston, VA

SUMMARY:

  • 12 years of experience in Data Warehousing ETL/Reporting technologies
  • Expertise in handling various aspects of Business Intelligence, viz. Data Warehouse design, development, administration and architecture; ETL administration, development and production support; Informatica PowerCenter / Data Quality / B2B / Master Data Management; Tableau; Ab Initio; Oracle PL/SQL; Teradata SQL; Netezza SQL; and UNIX Shell Scripting
  • Experience working in most of the tools in Informatica for handling various aspects of Data Warehousing
  • Experienced in interacting with business users to clarify business requirements
  • Creating Logical/Physical Data Models based on requirements
  • Designed Data Warehouses/Data Marts/Data Stores as per business requirements
  • Extensively worked in developing ETL to support data extraction, transformation and loading using Informatica PowerCenter and Data Quality (Workflow Manager, Workflow Monitor, Designer, Repository Manager, Developer, Analyst, etc.)
  • Established integration with a wide variety of data sources such as Salesforce, NetSuite, PostgreSQL, SQL Server, Netezza, Teradata, Oracle, flat files, Mainframes, etc.
  • Installed/upgraded Informatica versions 9.5.1/9.6.1/10.1 involving PowerCenter/Data Quality/Metadata Manager services
  • Experienced in installing Informatica cloud environments and doing development to integrate with wide variety of data sources including Salesforce
  • Experience working with Hadoop distributions on Amazon web services (AWS) and good understanding of Hadoop architecture
  • Experience in working with Map Reduce programs, Pig scripts and Hive commands
  • Experience working on Batch data ingestion into Hadoop Eco System using Apache SQOOP from RDBMS Netezza, Oracle & Teradata
  • Experience in working on Big data technologies such as Apache Hadoop, HDFS, Map Reduce, YARN, Hive, PIG and SQOOP
  • Wrote Python scripts to automate shakeout of Informatica environments from the application point of view
  • Good requirements containment skills and efficient project management, ensuring scope containment, deliverable management and high customer satisfaction
  • Experience in implementation of Master data management solutions using Informatica MDM
  • Worked on MDM CMX Hub configurations - data modeling & mappings, data validation, Match and Merge rules, Hierarchy Manager, and customizing/configuring Informatica Data Director (IDD)
  • Familiar with configuring Match & Merge setup in the MDM Hub - Match Path Components, Columns and Match Rule Sets, defining all suitable properties of Fuzzy and Exact Match concepts
  • Performed all kinds of MDM Hub jobs - Stage Jobs, Load Jobs, Match and Merge Jobs using the Batch Viewer and Automation Processes.
  • Developed the mappings to load the Data from Landing to Staging by using various cleanse functions.
  • Built Tableau dashboards by importing data from a variety of data sources
  • Implemented filters and parameters for preparing dashboards in Tableau
  • Developed dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to understand the data on the fly using quick filters for on-demand information
  • Scheduled data refreshes on Tableau Server in weekly and monthly increments based on business changes to ensure that views and dashboards displayed the changed data accurately
  • Experience in creating different visualizations using Bars, Lines, Pies, Maps, Histograms, Bullets, Heat Maps, Scatter Plots, Gantt charts and Bubbles
  • Extensive experience in tuning ETL components from a memory, CPU, IO and network perspective. Hands-on experience working with tuning parameters such as DTM Buffer size, Buffer block size, etc.
  • Worked extensively in building Dimensions, Facts, and Star Schemas and Snow Flake Schemas
  • Comprehensive experience working with Type 1, Type 2 and Type 3 methodologies for Slowly Changing Dimension (SCD) management
  • Extensive experience in writing UNIX automation scripts using Informatica Command Line Interface (CLI) commands viz. pmcmd, pmrep, infacmd.sh, infasetup.sh and server monitoring commands viz. iostat, vmstat, mpstat, top
  • Knowledge of implementing Big data using Informatica BDM (Big data management)
  • Worked on open source ETL technologies such as JasperETL and MySQL
  • Worked on Microsoft technologies such as SSIS and SQL Server.
  • Have knowledge of Web Technologies including Java and .NET.
  • Setup of Composite Environments along with load balancing.
  • Experience leveraging ITIL V3 processes, handling Incident/Change/Problem (RCA) tickets and resolving them as per SLA
  • Experience in leading teams to successful project implementation with proper management, engagement with top management, scope containment, and Quality assurance.
  • Experience working in Onshore/Offshore model. Experience leading a team of developers and Administrators.
  • Experience in providing On-Call support during off Business hours/weekends
  • Ability to achieve organizational integration, assimilate job requirements, employ new ideas, concepts, methods, and technologies.
  • Excellent communication, interpersonal and analytical skills.
  • Self-motivated, quick learner and adaptive to new and challenging technological environments.

TECHNICAL SKILLS:

PROJECT MANAGEMENT METHODOLOGIES: Agile, Kanban, Waterfall

ETL Tools: Informatica PowerCenter 7.1/8.6/9.1/9.6/10.1, Informatica PowerExchange, Informatica Data Quality 9.1/9.5/10.1, Informatica Cloud, Metadata Manager, Master Data Management, PowerExchange for Mainframe/SAP/Web Services/Netezza/Teradata, JasperETL, SQL Server Integration Services (SSIS)

DATABASES/DATASTORES: Salesforce, NetSuite, Oracle 12c/11g/10/9/8, Teradata 13.0, Netezza, MySQL, Microsoft SQL Server 2005/2008, DB2, Mainframe, PostgreSQL

REPORTING TOOLS: Tableau

BIG DATA TECHNOLOGIES: Apache Hadoop, HDFS, Map Reduce, YARN, Hive, PIG and SQOOP

EAI Tools: Composite 6.1

CLOUD: Amazon webservices, IBM Soft Layer

SCHEDULING TOOLS: Autosys, Crontab, UC4

STORAGE: NAS, SAN, Veritas, GPFS and DDN

TOOLS/LANGUAGES: SQL, PL/SQL, UNIX Shell Scripting, TOAD, .NET, Java, XML, HTML, Visual SourceSafe, Subversion, C++, Microsoft Project, Visio

OPERATING SYSTEMS: UNIX, RedHat Linux 5.8, SUSE Linux, Solaris, Windows 10/7

PROFESSIONAL EXPERIENCE:

Associate

Confidential, Reston

Responsibilities:

  • Discuss business requirements with end users
  • Designed and developed Informatica PowerCenter/Data Quality/Metadata Manager code for Multi Family applications
  • Created Informatica PowerCenter mappings/jobs using complex transformations, viz. Java, XML Source Qualifier, XML Parser, XML Generator, Web Service Provider, Web Service Consumer, HTTP, Unstructured Data Transformation and Transaction Control
  • Worked on Data Quality / Informatica Analyst Components viz. Profiles, Reference Tables, Mapping Specifications, Rule specifications, Workflow application, SQL Service applications, Web Service applications, Human Task
  • Worked on Informatica B2B, PowerCenter Unstructured data Transformation (UDT) and made use of Mapper, Parser and Streamer components for working with XML files.
  • Implemented Informatica static / dynamic partitioning with additional partition points on session on grid.
  • Set the standards and best practices for Informatica code development
  • Built code for Informatica Web Services (Consumer/Provider) and tested it via SOAP; supported Java integration with Informatica Web Services using REST APIs
  • Developed UNIX KSH scripts to create Informatica parameter files dynamically, SFTP files from the source server, validate data, archive files, and kick off Informatica jobs in batch mode
  • Configured Landing Tables, Base Objects, Relationships, Staging Tables, Mappings, custom cleanse functions, Match and Merge settings, Trust and Validation Rule.
  • Identified issues in the ETL load to reflect the current data model relationships.
  • Configured Entity Objects, Entity Types, Hierarchy and Relationship Types for Contract, Product, Party Hierarchical view in IDD
  • Built Tableau dashboards by importing data from a variety of data sources
  • Implemented action filters and parameters for preparing dashboards in Tableau
  • Developed dashboards in Tableau Desktop and published them to Tableau Server, allowing end users to understand the data on the fly using quick filters for on-demand information
  • Scheduled data refreshes on Tableau Server in weekly and monthly increments based on business changes to ensure that views and dashboards displayed the changed data accurately
  • Created different visualizations using Bars, Lines, Pies, Maps, Histograms, Bullets, Heat Maps, Scatter Plots, Gantt charts, Bubbles and Highlight Tables
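The dynamic parameter-file creation mentioned above can be sketched as follows. The section-header format `[Folder.WF:workflow]` follows the common PowerCenter parameter-file convention; the folder, workflow and parameter names below are hypothetical:

```python
def build_param_file(folder, workflow, params):
    """Render one PowerCenter parameter-file section as text.

    The "[Folder.WF:workflow]" header scopes the parameters to a
    single workflow; params maps parameter names to their values.
    """
    lines = [f"[{folder}.WF:{workflow}]"]
    lines += [f"{name}={value}" for name, value in params.items()]
    return "\n".join(lines) + "\n"
```

A batch script would write this text to a file and pass its path to pmcmd when starting the workflow, so each run can pick up a freshly generated load date or connection name.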

Environment: Informatica 9.5.1/9.6.1/10.1 (PowerCenter/Data Quality), B2B, MDM 9.7.1, Ab Initio 3.1.7, Oracle 11g/12c, Netezza 7.0.4, SQL Server 2012, GPFS, Veritas, MyServices, Remedy, ServiceNow, Autosys R11, Informatica Dynamic Data Masking, Data Validation

Customer Datawarehouse Informatica Architect/Admin/Developer

Confidential, Philadelphia

Responsibilities:

  • Served as the single point of contact for interaction with business users, data warehouse design, ETL tool selection and installation, and development/support of Informatica code
  • Analyzed data from various sources such as NetSuite, NetSuite CRM and Neatcloud (MongoDB)
  • Identified the right tools and configurations for accessing data from NetSuite and Neatcloud
  • Explored various ETL tools and selected Informatica PowerCenter Enterprise Edition for the CDW project
  • Executed proofs of concept on the PowerCenter Express tool to check whether it would meet all client requirements
  • Designed and implemented Informatica PowerCenter 9.6 infrastructure on Windows Server 2012
  • Designed and developed a Star Schema for the Customer Data Warehouse on MySQL
  • Developed and tested mappings for dimensions such as Customer and Product, and facts such as Transactions
  • Leveraged Perl scripting to automate Informatica daily load processes
  • Developed custom scripts to extract files kept on Amazon S3.
  • Backup and restore of Informatica repositories.
  • Create Users and Groups for granting access to users.
  • Migration of Informatica code across DEV/TST/PROD environments.
  • Performance tuning of code.
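The star-schema dimension loading described above can be sketched as a simple surrogate-key assignment; the column names and in-memory row representation are illustrative assumptions:

```python
def assign_surrogate_keys(incoming, dimension, natural_key, sk_name):
    """Assign surrogate keys to new dimension members, reusing the
    keys of members already present (a simplified dimension load).

    Returns a natural-key -> surrogate-key map that a subsequent
    fact load can use to resolve dimension foreign keys.
    """
    existing = {row[natural_key]: row[sk_name] for row in dimension}
    next_sk = max(existing.values(), default=0) + 1
    for rec in incoming:
        if rec[natural_key] not in existing:
            dimension.append({sk_name: next_sk, **rec})
            existing[rec[natural_key]] = next_sk
            next_sk += 1
    return existing
```

In PowerCenter this is typically done with a lookup transformation plus a sequence generator; the sketch only shows the key-resolution logic that those components implement.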

Environment: Informatica 9.6 Enterprise/Express, NetSuite, MySQL, JIRA, Windows Server 2012, Perl

Confidential, Chesterbrook, PA

Sr. Informatica Consultant

Responsibilities:

  • Administration of 9.0.1/9.5 Informatica infrastructure, including PowerCenter, Data Quality, PowerExchange, etc.
  • Created groups, roles and privileges and assigned them to each user group
  • Created folders and managed user and user group access to objects in the repository
  • Migrated objects in all phases (DEV, SIT, Prod and Post Prod) of projects
  • Used JIRA for tracking work such as Service Request, Incident Management and Work Requests.
  • Administered Model Repository service, Data integration service and Analyst services for Informatica Data Quality 9.1
  • Stopping/starting Informatica Repository and Server processes.
  • Worked with client/storage/UNIX/DBA teams to resolve issues
  • Solved various Production issues of Application teams
  • Opening Informatica SRs (support requests) and working along with the Informatica Global support to troubleshoot failures
  • Responded to and resolved requests, incidents and problems submitted by application teams within SLA
  • Prepared documents for requirements, design, unit testing, deployment, etc.
  • Designed and developed ETL Mappings, Sessions and Workflows based on the requirements
  • Maintenance of UC4 infrastructure.
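Object migration across environments of the kind described above is typically scripted around the pmrep command line. The sketch below builds an objectexport command; the flag usage should be verified against the installed Informatica version, and the object, folder and file names are hypothetical:

```python
def pmrep_export_cmd(obj_name, obj_type, folder, out_xml):
    """Build a pmrep objectexport argument list for exporting one
    repository object to XML (the XML is then imported into the
    target repository with objectimport).

    Assumes a pmrep session has already been opened with
    "pmrep connect"; flag names follow the commonly documented
    pmrep CLI and should be checked per version.
    """
    return ["pmrep", "objectexport",
            "-n", obj_name,   # object name
            "-o", obj_type,   # object type, e.g. workflow, mapping
            "-f", folder,     # repository folder
            "-u", out_xml]    # output XML file
```

Keeping the migration as generated command lists makes it easy to log, review and replay promotions from DEV through SIT to Prod.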

Environment: Informatica 9.0.1/9.5, UC4 V6/V9, Informatica Data Quality 9.0.1/9.5, JIRA, Linux, PuTTY, Netezza
