Data Integration Developer Resume

Peapack, NJ

SUMMARY

  • Extensive experience designing and developing ETL processes using Informatica and Ab Initio.
  • Complete SDLC experience in requirement gathering, analysis, designing, testing, integration, quality assurance, maintenance, and support.
  • Expert SQL experience analyzing and profiling large volumes of disparate data to address complex data integration issues.
  • Experience designing and developing reporting solutions using Tableau, QlikView, Cognos, and Crystal Reports.
  • Significant experience implementing data quality management solutions to monitor and capture data related issues.
  • Hands-on exposure to HDFS, MapReduce, Pig, Hive, Impala, HBase, and Sqoop.
  • Experience in Data Modeling, Data Mapping, Data Transformation, and MDM.

TECHNICAL SKILLS

Programming Languages: PL/SQL, SQL, XML, HTML, JSP, VBA, FoxPro

ETL Tools: Talend, Ab Initio, DataFlux, Informatica PowerCenter

Big Data: Hadoop/HDFS Architecture, Hive, Impala, Sqoop, Pig, Cloudera CDH5.5

Development Tools: SQL Developer, SOAP UI, Eclipse, SQL Navigator, Erwin, TOAD, SQL Loader, ClearCase, SQL*Plus, Visual Basic, PVCS

Reporting Tools: Tableau, QlikView, Crystal Reports, Crystal Enterprise, Cognos Reports, Cognos Framework Manager, Cognos Impromptu

Data Quality Tools: Ab Initio, Informatica IDQ, DataFlux

Data Modeling Tools: Erwin, Embarcadero ER Studio, MS Visio

Databases: Teradata, Oracle 11g, Sybase 11, SQL Server, MS Access

LMS Applications: SuccessFactors, Plateau, ISOtrain, Saba, SumTotal, Active Learner, TRIM, Registrar

Operating Systems: Linux, Unix, Windows, MacOS

PROFESSIONAL EXPERIENCE

Confidential, Peapack, NJ

Data Integration Developer

Technologies used: Informatica PowerCenter, SuccessFactors/Plateau, ISOtrain, Oracle 11g, PL/SQL, Stored Procedures, Packages, Erwin

Responsibilities:

  • Designed and reviewed Informatica ETL data mappings to migrate legacy LMS data from ISOtrain to SAP SuccessFactors.
  • Created Hive tables on the Enterprise Data Lake to populate the data in HDFS for data analysis and data profiling.
  • Worked with the SMEs to develop the data migration design specs.
  • Maintained the metadata, data dictionary, and data model, and provided Oracle performance tuning assistance where needed.
  • Researched and resolved defects reported in UAT, and coordinated and prioritized outstanding UAT defects and migration enhancements based on business requirements.
  • Created the SQL test scripts based on the design specs.
  • Provided technical support including SQL and Unix interface scripts for data cleansing and data preparation for the PLS upgrade project to the cloud.
  • Developed the ISOtrain decommissioning SQL scripts.
  • Coordinated development and analysis activities with the offshore team.
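The data-profiling work described above can be sketched in miniature. This is only an illustration: the table and column names are hypothetical, and SQLite stands in for the Hive/HDFS environment mentioned in the bullets.

```python
import sqlite3

# Hypothetical legacy LMS training-history extract; SQLite stands in
# for the Hive/Impala environment described above.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE training_history (
        learner_id TEXT, course_id TEXT, completion_date TEXT
    );
    INSERT INTO training_history VALUES
        ('E001', 'GMP-101', '2015-03-02'),
        ('E002', 'GMP-101', NULL),
        ('E002', NULL,      '2015-04-11');
""")

def profile(conn, table, columns):
    """Basic column profile: row count, null count, distinct count."""
    total = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
    report = {}
    for col in columns:
        nulls = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL").fetchone()[0]
        distinct = conn.execute(
            f"SELECT COUNT(DISTINCT {col}) FROM {table}").fetchone()[0]
        report[col] = {"rows": total, "nulls": nulls, "distinct": distinct}
    return report

report = profile(conn, "training_history",
                 ["learner_id", "course_id", "completion_date"])
```

A profile like this surfaces null-heavy or low-cardinality columns before mapping rules are written, which is typically where migration defects originate.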

Confidential, Berkeley Heights, NJ

Senior ETL Developer

Technologies used: Ab Initio, QlikView, Cognos Reports, Cognos Framework Manager, Oracle 11, SQL Developer, PL/SQL, Stored Procedures, MS Access, Linux, SQL*Loader, SQL Server, Unix

Responsibilities:

  • Worked with the ETL developers and Data Modelers to set ETL standards and reusable data quality rules to help ensure data integrity across ETL implementations.
  • Designed and developed Ab Initio graphs to optimize and replace complex PL/SQL stored procedures used to extract, transform, and load international underwriting data to the Global Risk Aggregation Data Warehouse.
  • Developed PL/SQL programs and scripts to harness international underwriting data to aggregate the global risk exposure.
  • Technical lead on the design and development of data quality applications to dynamically profile and validate onboarding source data.
  • Worked with business analysts to define, design, and create technical documentation on the ETL design, daily loads, transformations, and mappings.
  • Worked with business analysts on the design and development of a Scenario Generation model to better understand risk aggregation exposure and the consolidation of risk exposure limits.
  • Generated new QVD files from the Global Risk Aggregation Data Warehouse by writing scripts in QlikView.
  • Designed and developed QlikView sheet objects, including pivot tables, list boxes, multi boxes, multiple chart types, trend charts, custom Excel export requests, and fast-change objects for Management Dashboard reporting.
  • Involved in prototyping reports in QlikView.
  • Designed and developed Cognos Framework Manager data models and Cognos Reports.
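The reusable data quality rules mentioned above might look, in outline, like the following. The rule names and record fields are invented for illustration; the actual rules were implemented in the ETL tooling, not Python.

```python
# Minimal sketch of reusable, named data quality rules applied to
# incoming underwriting records; all field names are hypothetical.
RULES = {
    "policy_id_present":    lambda r: bool(r.get("policy_id")),
    "exposure_non_negative": lambda r: r.get("exposure", 0) >= 0,
    "country_code_valid":   lambda r: r.get("country") in {"US", "GB", "DE", "JP"},
}

def validate(record):
    """Return the names of the rules the record violates."""
    return [name for name, rule in RULES.items() if not rule(record)]

good = {"policy_id": "P-100", "exposure": 1_250_000, "country": "GB"}
bad  = {"policy_id": "",      "exposure": -5,        "country": "XX"}
```

Keeping rules in one named registry is what makes them reusable across ETL implementations: every load reports violations against the same rule names.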

Confidential, Nassau Park, NJ

Data Quality Architect

Technologies used: DataFlux, SQL Navigator, ERwin, SAP R/3, SQL Server, Oracle 10g, PL/SQL, Stored Procedures, MS Access, SharePoint, Remedy, HPQC, Linux, SOAP UI, Eclipse, PVCS, Autosys

Responsibilities:

  • Designed and developed reusable business rules using DataFlux to generate key metrics for data quality reporting of product data.
  • Modeled and designed the monitoring and reporting database framework using ERwin.
  • Developed ETL DataFlux programs to de-duplicate, cleanse and match records.
  • Conducted data profiling and data analysis in complex business systems using DataFlux to identify data anomalies.
  • Developed data quality rules around the MDM product master remediation project.
  • Performed interface and code reviews of the ETL programs for project standards, data policies, overall product quality and completeness.
  • Developed a Customer matching program in DataFlux to run against the Customer Exchange Hub and identify the master records to be loaded into a new Enterprise Contract Management application.
  • Developed Unix shell scripts to call and manage DataFlux jobs.
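The match-and-de-duplicate step above can be approximated as a match-key grouping. Real DataFlux match codes are far more sophisticated; the normalization rules and fields here are simplified placeholders.

```python
import re
from collections import defaultdict

def match_key(record):
    """Crude match key: normalized name plus postal code.
    A stand-in for a DataFlux match code; fields are hypothetical."""
    name = re.sub(r"[^a-z0-9]", "", record["name"].lower())
    name = re.sub(r"(inc|llc|corp|co)$", "", name)  # strip legal suffixes
    return (name, record["postal"])

def deduplicate(records):
    """Group records by match key; keep the first of each group as master."""
    groups = defaultdict(list)
    for r in records:
        groups[match_key(r)].append(r)
    masters = [grp[0] for grp in groups.values()]
    return masters, groups

customers = [
    {"name": "Acme Corp.", "postal": "08807"},
    {"name": "ACME CORP",  "postal": "08807"},
    {"name": "Beta LLC",   "postal": "10001"},
]
masters, groups = deduplicate(customers)
```

The survivorship rule here (first record wins) is deliberately naive; in practice the master record is chosen by source-system priority or field completeness.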

Confidential, Bridgewater, NJ

Senior Informatica Developer

Technologies used: Informatica PowerCenter, DataFlux, Oracle 10g, PL/SQL, Stored Procedures, MS Access, SharePoint, WebLogic, Apache, SQL Loader, SQL Server 2000, XML, MS Project

Responsibilities:

  • Worked closely with the MDM team to define and provide strategy for item harmonization, cleansing, enrichment, and de-duplication prior to loading items into Oracle PIM release 12.
  • Developed data conversion programs using PL/SQL to handle complex business rules and Oracle's key flex fields in preparing Item data for staging tables.
  • Worked closely with the business process teams to identify the business rules to prepare the Extract, Transform, and Load (ETL) functional and technical design specifications for the Item Master component of the MRP data conversion.
  • Designed and developed Informatica Mappings to load data from source systems to staging tables and then to target tables.
  • Designed the ETL strategy in the form of mappings, transformations, sessions and workflows based on the business requirements to convert legacy manufacturing data to Oracle’s E-Business Suite.
  • Developed a PL/SQL interface to supply records to and retrieve records from SharePoint, recording Automated Attendance entries as Training History in Plateau.
  • Led the effort to develop the data migration strategy, execute trial runs, and conduct data validation and the final migration to Plateau.
  • Developed and maintained PL/SQL data migration scripts and packages used to migrate legacy LMS data from Active Learner, TRIM, Registrar, and spreadsheets to Plateau.
  • Used DTS packages and MS Access to extract and transform various training records file formats to Plateau.
  • Developed custom XML Plateau reports based on Confidential Global Manufacturing requirements.
  • Developed testing scripts for data migrations and functional requirements.
  • Assessed the data migration gaps based on Confidential Global Manufacturing’s current Plateau migration gateway and SumTotal’s LMS migration package.
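Preparing item data for staging tables, including concatenating key-flexfield-style segments, can be sketched as below. The segment names, separator, and default unit of measure are all assumptions for illustration, not the project's actual conversion rules.

```python
# Sketch of converting a legacy item record into a staging-table row,
# concatenating key-flexfield-style segments; all names are hypothetical.
FLEX_SEPARATOR = "."

def to_staging(legacy_item):
    """Map one legacy manufacturing item to the staging layout."""
    segments = [legacy_item["family"], legacy_item["class"], legacy_item["code"]]
    return {
        "item_number": FLEX_SEPARATOR.join(segments).upper(),
        "description": legacy_item["desc"].strip(),
        "uom": legacy_item.get("uom", "EA"),  # assumed default unit of measure
    }

legacy = {"family": "pkg", "class": "lbl", "code": "0042",
          "desc": " Carton label "}
staged = to_staging(legacy)
```

The two-hop pattern in the bullets (source to staging, staging to target) exists so that transforms like this can be validated and re-run against staging before anything touches the target item master.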

Confidential, New York, NY

Senior Crystal Reports Developer

Technologies used: Oracle 9i, Sybase, PL/SQL, Stored Procedures, Packages, Crystal Reports 10, Crystal Enterprise, OLAP, ClearCase, Test Director

Responsibilities:

  • Designed and developed complex PL/SQL procedures, using exception handling and cursor loops to compare FIX messages to trade and volume activity for a new electronic trading system.
  • Proposed, designed, and developed OLAP cubes to deliver trade volume metrics.
  • Developed Crystal Reports and stored procedures for the Exchange’s trading systems.
  • Published and scheduled reports using Crystal Enterprise.
  • Worked closely with developers, business analysts, quality assurance, customer service, and management to review requirements, resolve database reporting issues, and develop and deploy new solutions.
  • Tuned SQL queries to improve the reporting performance of the Data Warehouse.
  • Generated ad-hoc SQL requests for senior management and the customer service team.
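The query-tuning work above usually starts by reading the query plan. A toy illustration, with SQLite standing in for the warehouse database and invented table and index names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id INTEGER, symbol TEXT, qty INTEGER)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)",
                 [(i, "XYZ" if i % 2 else "ABC", 100) for i in range(1000)])

def query_plan(conn, sql):
    """Flatten EXPLAIN QUERY PLAN output into one string."""
    return " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

sql = "SELECT SUM(qty) FROM trades WHERE symbol = 'ABC'"
plan_before = query_plan(conn, sql)   # full table scan
conn.execute("CREATE INDEX idx_trades_symbol ON trades(symbol)")
plan_after = query_plan(conn, sql)    # search via the new index
```

Comparing the plan before and after an index (or a query rewrite) is the same discipline regardless of engine; on Oracle the equivalent step is `EXPLAIN PLAN` plus `DBMS_XPLAN`.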

Confidential, New York, NY

LMS Systems Analyst

Technologies used: Saba, Oracle 9i, PL/SQL, Stored Procedures, Triggers, Crystal Reports, JSP, VBA, Erwin, Perl, MS Project, Visio, MS Access

Responsibilities:

  • Collaborated with the Saba consulting team to customize and implement a Learning Management System for the McKinsey Learning Group.
  • Created views, PL/SQL stored procedures, and triggers in developing the Saba notifications and reports.
  • Assisted the development team with database related issues such as SQL tuning and the business sponsors with ad-hoc SQL queries.
  • Designed and developed custom web reports using Crystal Reports.
  • Designed and documented a migration strategy for loading the legacy learning data and daily employee information to Saba and the learning history back to the ERP system (PL/SQL).
  • Integrated the McKinsey learning portal into the Saba database architecture.
  • Created test plans, test procedures, data flow diagrams (Visio), and technical support documentation for the QA and offshore development teams.
  • Managed and resolved application, web, back-end server, and database technical issues.
  • Built management reports using Excel VBA and JSP pages for the Firm Learning Group.
  • Worked with DBAs to create the logical and physical data models.
  • Developed test cases and conducted unit testing using black and white box testing methodologies on McKinsey's ERP system.
  • Analyzed and optimized ad-hoc SQL views for the Human Resource, Training, and Engagement modules of the ERP system.
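The trigger-driven notifications mentioned above follow a common shape: an AFTER UPDATE trigger writes a notification row when a status changes. A minimal sketch, with SQLite standing in for the Saba Oracle schema and all table and column names invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE enrollments (learner TEXT, course TEXT, status TEXT);
    CREATE TABLE notifications (learner TEXT, message TEXT);

    -- Queue a notification whenever an enrollment is marked completed.
    CREATE TRIGGER notify_completion
    AFTER UPDATE OF status ON enrollments
    WHEN NEW.status = 'COMPLETED'
    BEGIN
        INSERT INTO notifications
        VALUES (NEW.learner, 'Course ' || NEW.course || ' completed');
    END;
""")

conn.execute("INSERT INTO enrollments VALUES ('E001', 'LDR-200', 'ENROLLED')")
conn.execute("UPDATE enrollments SET status = 'COMPLETED' WHERE learner = 'E001'")
rows = conn.execute("SELECT * FROM notifications").fetchall()
```

A downstream job (in the Saba case, the notification engine and Crystal Reports) then consumes the queued rows, keeping the transactional update decoupled from message delivery.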
