
Data Integration Specialist Resume


Ann Arbor, MI

SUMMARY

  • Data Integration Specialist with more than 12 years of experience leading, designing, and delivering robust, scalable, and highly available Data Warehouse and Business Intelligence solutions.
  • Performed data modeling with the IBM InfoSphere Data Architect tool and data profiling with Informatica Data Quality, and built BI reports to improve business performance.
  • Built the Enterprise Data Lake and Data Warehouse
  • High attention to detail, strong organization, team orientation, and the ability to work under tight deadlines.
  • Expert in Informatica client tools; develop and test ETL mappings and help the team adhere to industry standards.
  • Expert in building Data Migrator (Information Builders) ETL jobs to pull data from various sources (SQL Server, Oracle, Google Analytics, REST APIs).
  • Strong experience in writing complex SQL queries, data modeling, and ETL strategy.
  • Experience with AWS services such as S3, Kinesis, Redshift, Lambda, and CloudWatch (a minimal staging sketch follows this list).
  • Work closely with business agile partners, stakeholders, solution architects, and data modelers to understand business requirements and provide DW/BI solutions in a timely, cost-effective manner, in line with the company's architecture and SAFe Agile standards.
  • Experienced in working with various file types, including delimited, XML, and JSON files.
  • Hands-on experience handling online messaging and interfaces such as AMQ, REST APIs, and web services.
  • Knowledge of and hands-on experience with Hadoop tools such as HDFS, MapReduce, Hive, Pig, and Kafka on the Hortonworks framework.
  • Strong experience in writing advanced SQL, tuning performance in Informatica and Talend, and problem-solving to analyze and resolve issues.
  • Ensure timely and accurate resolution of production issues, monitoring event logs and server error logs for troubleshooting.
  • Participate in release management activities and manage all tasks associated with project transitions.
  • Proactively keep customers informed of how and when problems are resolved, with a focus on retention.
  • Gather requirements from stakeholders, including business subject matter users and IT teams.
  • Document data migration requirements, including source system analysis and source-to-target mappings.
  • Analyze and acquire data from primary and secondary data sources, creating mapping specifications for use by ETL development resources.
  • Created data mappings between Salesforce CPQ and the data warehouse environment.
  • Document plans for the collection, cleansing, and normalization of data.
  • Proven experience in system analysis, design, development, Data Warehouse/Data Mart design, RDBMS, and implementation of client-server applications under SDLC and Agile methodologies.
  • Used the IDQ tool for address validation with Address Doctor reference data.
  • Performed data analysis, profiling, and cleansing using IDQ, including labeling, standardization, preparing reference data, and data consolidation to build golden records.
  • Good experience preparing reusable objects such as ETL transformations, sessions, mapplets, worklets, and user-defined functions.
  • Delivered most of the projects under Agile Methodology.
  • Over three years of experience in EDI conversions for 837I, 837P, and NCPDP government submissions.
  • Extensive experience with scheduling tools such as CA7, Maestro, and Appworx.
  • Strong in performance tuning of complex SQL queries.
  • Very flexible and can work independently as well as in a team environment.
  • Involved in preparing HLDs, LLDs, test cases, test plans, cutover plans, and ETL estimates.
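
For illustration, a minimal sketch of the S3-to-Redshift staging pattern mentioned in the AWS bullet above, assuming the AWS CLI and psql are configured; the bucket, table, file, DSN, and IAM role names are all hypothetical:

#!/usr/bin/env sh
# Stage a delimited extract to S3, then COPY it into Redshift.
# All names below (bucket, table, role, DSN) are placeholders.
set -eu

FILE=/data/out/orders_20200101.csv          # delimited extract from an upstream job
BUCKET=s3://example-dw-staging/orders/      # hypothetical S3 staging prefix

aws s3 cp "$FILE" "$BUCKET"                 # push the extract to the staging bucket

# Redshift COPY reads straight from S3; $REDSHIFT_DSN holds the connection string.
psql "$REDSHIFT_DSN" <<'SQL'
COPY stg.orders
FROM 's3://example-dw-staging/orders/orders_20200101.csv'
IAM_ROLE 'arn:aws:iam::123456789012:role/redshift-copy'
CSV IGNOREHEADER 1;
SQL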

TECHNICAL SKILLS

Operating System: UNIX/Linux, MS-DOS, Windows 2008/2007/NT/2000/XP.

ETL Tools: Talend 6.x/7.1.1, Informatica PowerCenter 9.6.1/8.5.1/7.x/6.x, PowerExchange for mainframe VSAM files, Informatica Data Quality (IDQ), Informatica Analyst, Master Data Management (MDM).

Data Modeling Tools: Erwin, ER Studio, IBM Infosphere Data Architect 9.1.3.

BI Tools: Cognos (Framework Manager, Query Studio, Report Studio), QlikView, Power BI.

Database: Teradata, MS SQL Server 2008, DB2, Oracle 11g.

Language/Scripting: UNIX Shell Scripting.

Other Tools: UltraEdit, Altova XMLSpy, TOAD, SQL Developer, WinSCP, PuTTY, Quick View, IBM Rational ClearCase, ClearQuest, Build Forge, Power Designer, IT2B, Teradata SQL Assistant, Microsoft Visio, Control-M, IBM Maestro scheduler, AQT.

PROFESSIONAL EXPERIENCE

Confidential, Ann Arbor, MI

Data Integration Specialist

Responsibilities:

  • Involved in understanding business processes and coordinating with business users to gather specific user requirements for building the data marts.
  • Documented user requirements, translated them into system solutions, and developed implementation plans and schedules.
  • Integrated Adobe clickstream data for further analysis.
  • Built the driver app to track store order deliveries in a graphical layout.
  • Used big data components such as Hive, HDFS, and Spark to handle large data volumes.
  • Modeled star schema data marts by identifying the facts and dimensions using the Idera data modeling tool.
  • Created jobs in the Talend Data Integration tool to extract from heterogeneous data sources such as SQL Server 2016, AMQ messages, and Oracle 10g.
  • Handled various store files across domestic and international locations.
  • Shared data across different systems using AMQ messages and JSON files (a routing sketch follows this list).
  • Provided 24/7 production support for various applications.
  • Used the Talend Data Integration tool to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.
  • Ensured timely and accurate resolution of production issues, monitoring event logs and server error logs for troubleshooting.
  • Monitored and scheduled existing and new jobs in the production environment.
  • Built and maintained strong relationships with business partners and technology teams to identify process improvement opportunities.
  • Knowledge of incident, problem, ticket, change, and risk management processes and tools.
  • Served as primary support liaison between the company and the customer, documenting incidents in the required tracking systems.
  • Created and modified reports using various reporting and analytical tools for data mining and ad hoc reporting, and identified areas where efficiencies could be achieved through automation.
  • Responsible for liaising with Development and other departments to ensure proper handling of customer issues.
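
As an illustration of the JSON file sharing above, a minimal routing sketch for incoming store files; the directory layout and the storeId/orders field names are assumptions, not the actual application design:

#!/usr/bin/env sh
# Route incoming JSON store files: well-formed files move on to staging,
# malformed ones are quarantined for support review. Paths are placeholders.
set -eu

IN=/landing/stores
OK=/stage/stores
ERR=/reject/stores

for f in "$IN"/*.json; do
  [ -e "$f" ] || continue                              # no files landed this cycle
  if jq -e '.storeId and .orders' "$f" >/dev/null 2>&1; then
    mv "$f" "$OK"/
  else
    mv "$f" "$ERR"/
    logger -t store-load "rejected $(basename "$f")"   # surfaces in syslog for support
  fi
done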

Environment: Informatica PowerCenter, MS SQL Server 2017, Talend Studio, AWS, Big Data, Control-M

Confidential, Columbus, OH

Data Analyst

Responsibilities:

  • Gather requirements from stakeholders, including business subject matter users and IT teams.
  • Document data migration requirements, including source system analysis and source-to-target mappings.
  • Connected the BRD and the development team through proper process flow and mapping documents.
  • Involved in building the data warehouse that maintains policy information, using a snowflake schema.
  • Involved in creating dimension and fact tables.
  • Involved in system analysis, design, development, Data Warehouse/Data Mart design, and RDBMS work.
  • Identified and finalized the source systems from which to fetch the requisite information.
  • Involved in performance tuning by writing complex and effective SQL.
  • Involved in table partitioning for better performance (a partitioning sketch follows this list).
  • Key activities included managing deliverables, managing cross-location teams, and handling onboarding/offboarding of resources and attrition.
  • Managed the client relationship to capture expectations and draw a roadmap with regular milestones.
  • Created unit test cases and test scripts to validate data.
  • Reviewed the detail-level design and ensured that IT standards were followed.
  • Created complex Informatica mappings, mapplets, and reusable transformations, and prepared various mappings to load data into the different stages, such as landing, staging, and target tables.
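
A minimal sketch of the range-partitioning approach behind the tuning bullets above, assuming an Oracle fact table for policy data; the table and column names are hypothetical:

#!/usr/bin/env sh
# Create a range-partitioned fact table so queries filtered by date prune partitions.
# $ORACLE_CONN holds user/password@tns; table and columns are placeholders.
sqlplus -s "$ORACLE_CONN" <<'SQL'
CREATE TABLE policy_fact (
  policy_id    NUMBER,
  effective_dt DATE,
  premium_amt  NUMBER(12,2)
)
PARTITION BY RANGE (effective_dt) (
  PARTITION p2013 VALUES LESS THAN (DATE '2014-01-01'),
  PARTITION p2014 VALUES LESS THAN (DATE '2015-01-01'),
  PARTITION pmax  VALUES LESS THAN (MAXVALUE)
);
EXIT
SQL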

Environment: Data Quality, Oracle 11g, IBM InfoSphere Data Architect, Salesforce CPQ, MS SQL Server 2008, Windows XP, UNIX, Flat Files.

Confidential, Richmond, VA

Technical Lead

Responsibilities:

  • Involved in data analysis, profiling legacy data, and cleansing the data for better data quality.
  • Used Address Doctor reference data for client address validation using MDM.
  • Involved in preparing reference data to maintain data standardization.
  • Built golden records using the data consolidation concept in IDQ for better data quality.
  • Created the high-level design and the tool deployment architecture.
  • Understood user requirements and prepared sourcing specifications for the source systems.
  • Led source system calls to help the source team understand the development requirements.
  • Understood the requirements and built end-to-end Informatica mapping specifications.
  • Developed end-to-end mappings, sessions, and workflows and migrated them to the UAT and production environments.
  • Developed UNIX scripts for the file-to-stage layer without using an ETL tool (sketched after this list).
  • Analyzed the data to find the appropriate joins and the master and dimension tables, and pulled the data while reducing complexity.
  • Worked as a solution architect on performance issues and new design approaches.
  • Enhanced mappings for production SRs and IRs.
  • Performed unit testing and prepared unit test case documents.
  • Developed ad hoc mappings for one-time loads.
  • Fixed performance issues in some of the ETL code and SQL.
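
A minimal sketch of the non-ETL file-to-stage load described above, assuming a pipe-delimited daily feed loaded into a Greenplum stage table; the paths, table name, and connection string are hypothetical:

#!/usr/bin/env sh
# Load a flat file into the stage layer with psql alone (no ETL tool).
# Paths, table, and $GP_DSN are placeholders.
set -eu

SRC=/landing/accounts_$(date +%Y%m%d).dat   # assumed pipe-delimited daily feed
LOG=/var/log/etl/accounts_load.log

# Fail fast if the feed is missing or empty.
[ -s "$SRC" ] || { echo "missing/empty feed: $SRC" >> "$LOG"; exit 1; }

# \copy streams the file through the client into the Greenplum stage table.
psql "$GP_DSN" -c "\copy stage.accounts FROM '$SRC' DELIMITER '|' NULL ''" >> "$LOG" 2>&1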

Environment: Informatica PowerCenter 9.6, Data Quality, Oracle 11g, Greenplum, Appworx, Windows XP, UNIX, Flat Files.

Confidential, Cary, NC

Technical Lead

Responsibilities:

  • Eservice is a downstream application that surfaces the Confidential contract information and daily transactions; the application displays the data in drill-down and drill-up form.
  • Understood the legacy and Eservice downstream applications and the data flow.
  • Gathered requirements from stakeholders, including business subject matter users and IT teams.
  • Documented data migration requirements, including source system analysis and source-to-target mappings.
  • Connected the BRD and the development team through proper process flow and mapping documents.
  • Involved in requirement gathering and in channeling the Envestnet data to the Eservice application.
  • Involved in preparing HLDs, LLDs, test cases, test plans, cutover plans, and ETL estimates.
  • Worked with the Informatica PowerCenter Warehouse Designer, Transformation Designer, Mapping Designer, and Workflow Manager to develop new mappings and implement the data warehouse.
  • Performed ETL design using various Informatica PowerCenter transformations, such as Source Qualifier, Expression, Router, Normalizer, and connected and unconnected Lookups.
  • Ran status calls with the business and followed up on open questions with the business and solution teams.
  • Troubleshot and resolved data and code issues related to ETL components.
  • Responsible for delivering the data masking program with a team of 7 members (a masking sketch follows this list).
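
For the data masking bullet above, a minimal static-masking sketch for a delimited extract; the field position, SSN format, and file layout are assumptions for illustration only:

#!/usr/bin/env sh
# Mask the (assumed) SSN in column 4 of a pipe-delimited extract before the data
# leaves production. Column position and format are placeholders.
set -eu

IN=$1                          # e.g. members_20200101.dat
OUT=${IN%.dat}_masked.dat

# Keep only the last four digits; overwrite the rest with X's.
awk -F'|' 'BEGIN { OFS="|" } { $4 = "XXX-XX-" substr($4, 8, 4); print }' "$IN" > "$OUT"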

Environment: Informatica PowerCenter 9.6, IDQ, MDM, DB2, Oracle, Flat Files, Windows XP, UNIX.

Confidential, Los Angeles, CA

Senior Informatica Developer

Responsibilities:

  • Involved in requirement gathering with the customer.
  • Troubleshot and resolved data and code issues related to ETL components.
  • Developed Informatica mappings and workflows for extracts and feeds; data is extracted through mappings from different sources such as relational database tables and VSAM files.
  • Involved in government DHS encounter submissions of 837I, 837P, and NCPDP.
  • Used B2B Data Transformation Studio to import and easily customize prebuilt transformations for industry standards such as EDI, SWIFT, and HIPAA.
  • Used B2B to keep 837 EDI files compliant with industry standards and regulatory requirements in real time.
  • Wrote UNIX shell scripts to move mainframe VSAM files from the landing zone to the Informatica source folders.
  • Used shell scripting to break the files into small blocks to improve performance (a sketch follows this list).
  • Worked with the Informatica PowerCenter Warehouse Designer, Transformation Designer, Mapping Designer, and Workflow Manager to develop new mappings and implement the data warehouse.
  • Performed ETL design using various Informatica PowerCenter transformations, such as Source Qualifier, Expression, Router, Normalizer, and connected, unconnected, and dynamic Lookups for SCD Type 2.
  • Used Workflow Manager to create various tasks, such as Session, Event-Wait, and Command tasks.
  • Designed ETL processes for optimal performance.
  • Met Business demands by having direct client interactions, resource management, requirements gathering, analysis, design, development, change control and Implementation.
  • Exceptional background in analysis, design, development, customization, implementation and testing of software applications and products in all phases of the system development life cycle (SDLC)
  • Worked with application managers to prioritize the support issues from different applications and multiple projects in getting them resolved and implemented within SLA.
  • Analyzed requirements and produced functional specifications with business user groups; translated the business requirements and documented source-to-target mappings and ETL specifications.
  • Worked extensively on Informatica partitioning when dealing with huge data volumes and also partitioned the tables; used Pass-Through, Round-Robin, and Hash partitioning.
  • Worked on Command, Event-Wait, Event-Raise, and Timer tasks to implement business logic.
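
A minimal sketch of the file-splitting step described above; the landing and source-folder paths, the file pattern, and the block size are assumptions:

#!/usr/bin/env sh
# Break large VSAM-derived flat files into fixed-size blocks so downstream
# sessions can read them in parallel. All paths are placeholders.
set -eu

LANDING=/landing/mainframe            # where the VSAM-derived files arrive
SRCDIR=/infa/srcfiles/claims          # assumed Informatica source-file directory

mkdir -p "$LANDING/archive"
for f in "$LANDING"/claims_*.dat; do
  [ -e "$f" ] || continue
  # 500k-line blocks named claims_YYYYMMDD_part_aa, _ab, ...
  split -l 500000 "$f" "$SRCDIR/$(basename "$f" .dat)_part_"
  mv "$f" "$LANDING/archive/"
done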

Environment: Informatica 9.1, Oracle 11g, Windows XP, UNIX.

Confidential

Senior Informatica Developer

Responsibilities:

  • Involved in government DHS encounter submissions of 837I, 837P, and NCPDP.
  • Involved in building the data warehouse that maintains policy information, using a snowflake schema.
  • Involved in creating dimension and fact tables.
  • Involved in system analysis, design, development, Data Warehouse/Data Mart design, and RDBMS work.
  • Strong in reading and writing XML through Informatica.
  • Involved in XSD design and development using Altova XMLSpy.
  • Used the XML Parser and XML Generator transformations in Informatica.
  • Used XSD features such as nodes, elements, complex types, and sequence occurrence constraints (minimum and maximum occurrences) for data elements.
  • Worked with the Source Analyzer, Warehouse Designer, Transformation Designer, Mapping Designer, and Workflow Manager to develop new mappings and implement the data warehouse.
  • Performed ETL design using various Informatica PowerCenter transformations, such as Source Qualifier, Expression, Router, Normalizer, and connected, unconnected, and dynamic Lookups for SCD Type 2.
  • Created parameter files to pass database connections and parameter entries for sources and targets (a parameter-file sketch follows this list).
  • Involved in internal submissions such as BHI, CRIB, and Inovalon submissions for the HIX project.
  • Involved in 24/7 production support for various ETL jobs (Informatica and Talend).
  • Led the development team and coordinated projects among the developers.
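
For the parameter-file bullet above, a sketch of generating a PowerCenter parameter file per environment from the shell; the folder, workflow, session, and connection names are hypothetical, and the section header follows the usual [folder.WF:workflow.ST:session] convention:

#!/usr/bin/env sh
# Emit a parameter file for a (hypothetical) workflow before it runs, so the same
# mappings point at DEV/UAT/PROD connections without code changes.
ENV=${1:-DEV}
PARAM=/infa/param/wf_policy_load.parm

cat > "$PARAM" <<EOF
[EDW.WF:wf_policy_load.ST:s_m_policy_load]
\$DBConnection_Src=ORA_${ENV}_SRC
\$DBConnection_Tgt=TD_${ENV}_EDW
\$\$LoadDate=$(date +%Y-%m-%d)
EOF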

Environment: Informatica 9.1, Teradata, BTEQ, Talend, Oracle, Windows XP, UNIX.
