
ETL/Big Data Architect Resume


Jersey City, NJ

SUMMARY:

  • Extensive experience in the IT industry with strong exposure to data modeling, data architecture, solution architecture, data warehousing & business intelligence, and master data management (MDM) concepts
  • Informatica Certified Developer & Informatica Certified Data Quality Specialist
  • Strong knowledge of all phases of the Software Development Life Cycle (SDLC); worked on large-scale projects with incremental deliverables
  • Experience in OLTP modeling (2NF, 3NF) and OLAP dimensional modeling (Star and Snowflake schemas) using ERwin (conceptual, logical and physical data models)
  • Good experience in Big Data tools - Cloudera Hadoop, Sqoop, Hive, Impala, Spark
  • Extensive knowledge in architecture design of Extract, Transform, Load environment using Informatica Power Center
  • Experience in implementing complex business rules by creating Informatica transformations and reusable transformations (Connected and Unconnected Lookup, Joiner, Union, Sorter, Aggregator, Rank, Normalizer, Filter, Router and Update Strategy), and developing complex Mappings & Mapplets
  • Experience in data quality tools like Informatica Data Quality (IDQ)
  • Profiled the data from disparate source systems using Informatica Data Explorer (IDE)
  • Imported metadata from different sources such as relational databases and Informatica mappings to build data lineage using Informatica Metadata Manager
  • Experience in integrating various data source definitions such as SQL Server, Oracle, Sybase, ODBC connectors & flat files
  • Expertise in debugging and performance tuning of Informatica mappings, sessions & SQL stored procedures
  • Experience with relational databases such as Oracle 8i/9i/10g/11g, SQL Server, Sybase, MS Access
  • Strong skills in SQL, PL/SQL packages, functions, stored procedures & triggers
  • Hands on experience in UNIX shell scripting
  • Experience in working with various scheduling tools (Control M scheduler & CA scheduler)
  • Team player with excellent interpersonal skills and the ability to work effectively with different stakeholders on a project

TECHNICAL SKILLS:

Core Competency: Data modeling, data architecture, ETL solution architecture, Master Data Management (MDM), Business Intelligence (BI) reporting

Operating System: Windows, UNIX

Data warehouse Utilities: Informatica, Informatica Data Quality, Informatica Data Explorer, Informatica Metadata Manager, Oracle Warehouse Builder, Business Objects, Crystal Reports

Databases: Oracle, SQL Server, Sybase, MS Access

Database Tools: Embarcadero Rapid SQL, Embarcadero DBArtisan, TOAD

Big Data Tools: Hadoop, Sqoop, Hive, Impala, Spark

Schedulers: Control M scheduler, CA scheduler

Programming Languages: Visual Basic, ASP, JavaScript, VBScript, HTML

Other Tools: MS Office, MS Project Plan, MS Visio, SharePoint

PROFESSIONAL EXPERIENCE:

Confidential

ETL/Big Data Architect, Jersey City, NJ

Responsibilities:

  • As the lead architect, guided designers and other developers on the team and helped provide the right technical solutions to the business.
  • Set up the Cloudera platform and various tools (Sqoop, Hive, Impala, Spark).
  • Migrated the existing data to the Big Data platform using Sqoop.
  • Created Hive tables and Impala metadata to access the data from HDFS.
  • Redesigned the existing Informatica ETL mappings & workflows using Spark SQL.
  • Led various business initiatives and BAU functions for the existing data warehouse application.
  • Supported the daily/weekly ETL batches in the Production environment
  • Responded promptly to business user queries and change requests.
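
The Sqoop-based migration described above can be sketched as follows. This is a minimal illustration, not the actual project code: the JDBC URL, credentials, and table/directory names are hypothetical placeholders, and the command would normally be run from a scheduled shell wrapper.

```python
# Sketch: compose a Sqoop import command for a full-table migration into
# HDFS, with a matching Hive staging table. All connection details and
# names below are hypothetical placeholders.

def build_sqoop_import(jdbc_url, username, table, target_dir, mappers=4):
    """Return the sqoop import argument list for one source table."""
    return [
        "sqoop", "import",
        "--connect", jdbc_url,          # JDBC URL of the source database
        "--username", username,          # password typically via --password-file
        "--table", table,                # source table to extract
        "--target-dir", target_dir,      # HDFS landing directory
        "--num-mappers", str(mappers),   # parallel extract tasks
        "--hive-import",                 # also create/load a Hive table
        "--hive-table", f"stg.{table.lower()}",
    ]

cmd = build_sqoop_import(
    "jdbc:sybase:Tds:dbhost:5000/warehouse", "etl_user",
    "TRADES", "/data/landing/trades",
)
print(" ".join(cmd))
```

Once landed, the Hive/Impala metadata layer described above makes the same HDFS files queryable from both engines.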

Environment: Informatica, Sybase, Linux, CA Scheduler, Hadoop, Sqoop, Hive, Impala, Spark

Confidential

ETL/Big Data Architect, New York, NY

Responsibilities:

  • Performed business analysis and requirements gathering, and converted the requirements into technical specifications
  • Architected all the ETL data loads from the source system into the data warehouse
  • Built a robust client reporting platform that streamlines generation and delivery of client reports for Confidential Americas clients
  • Developed Informatica Sessions & Workflows using Informatica workflow manager
  • Optimized the performance of the Informatica mappings by analyzing the session logs and understanding various bottlenecks (source/target/transformations)
  • Used Informatica debugging techniques to debug the mappings and used session log files and bad files to trace errors that occur while loading
  • Implemented all the data quality rules in Informatica Data Quality (IDQ).
  • Imported metadata from different sources such as Relational Databases, Informatica mappings to build data lineage using Informatica Metadata Manager
  • Involved in Oracle PL/SQL query optimization to reduce the overall run time of stored procedures
  • Created UNIX shell scripts to invoke the Informatica workflows & Oracle stored procedures
  • Created UNIX shell scripts to move, archive & FTP data files to downstream applications.
  • Designed Control M scheduler jobs to invoke the UNIX shell scripts
  • Involved in unit testing of various objects (Informatica workflows/Oracle stored procedures/UNIX scripts)
  • Migrated large volumes of PB data warehouse data to HDFS.
  • Developed Spark jobs to transform the data in HDFS
  • Supported various testing cycles during the SIT & UAT phases.
  • Involved in the initial data setup in the Production environment and in code migration activities to Production.
  • Supported the daily/weekly ETL batches in the Production environment
  • Responded promptly to business user queries and change requests.
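
The post-load file move/archive step described above might look like the following sketch. The original scripts were UNIX shell; this is a Python rendering for illustration only, and the landing path, archive root, and file pattern are hypothetical placeholders.

```python
# Sketch: after a successful load, move processed data files from the
# landing area into a dated archive directory (archive/YYYY-MM-DD/).
# Paths and the *.dat pattern are hypothetical placeholders.
import shutil
from datetime import date
from pathlib import Path

def archive_files(landing_dir, archive_root, pattern="*.dat"):
    """Move matching files into archive/<today>/ and return their names."""
    dest = Path(archive_root) / date.today().isoformat()
    dest.mkdir(parents=True, exist_ok=True)   # create dated folder once
    moved = []
    for f in sorted(Path(landing_dir).glob(pattern)):
        shutil.move(str(f), str(dest / f.name))
        moved.append(f.name)
    return moved
```

A Control M job would typically invoke such a script as the final step of the daily batch, after the Informatica workflows complete.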

Environment: Informatica, Informatica Data Quality, Informatica Metadata Manager, Oracle 11g, Rapid SQL, UNIX, Control M Scheduler, Hadoop, Sqoop, Hive, Impala, Spark

Confidential

Informatica Architect, New York, NY

Responsibilities:

  • Performed business analysis and requirements gathering, and converted the requirements into technical specifications
  • Designed the MIS data mart (star schema dimensional modeling) after analyzing various source systems and the final business objects reports
  • Architected all the ETL data loads from the source system into the MIS data mart
  • Designed all the slowly changing dimensions to hold all the history data in the data mart
  • Developed all the ETL data loads in Informatica Power Center to load data from the source database into various dimensions and facts in the MIS data mart
  • Implemented Slowly Changing Dimensions (Type 2) while loading data into dimension tables to hold history
  • Created reusable transformations, Mapplets and used them in the mappings and workflows
  • Developed Sybase stored procedures to load data into some of the fact tables
  • Designed CA scheduler jobs to invoke the UNIX shell scripts
  • Involved in unit testing of various objects (Informatica workflows/Sybase stored procedures/UNIX scripts)
  • Supported various testing cycles during the SIT & UAT phases.
  • Involved in the initial data setup in the Production environment and in code migration activities to Production.
  • Supported the daily/weekly ETL batches in the Production environment
  • Responded promptly to business user queries and change requests.
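
The Type 2 slowly changing dimension load described above follows a close-and-insert pattern: when a tracked attribute changes, the current row is expired and a new version is inserted. The production logic lived in Informatica mappings; this is an in-memory sketch with hypothetical column names (cust_id, address, eff_date, end_date, is_current).

```python
# Sketch: SCD Type 2 — expire the current dimension row on change and
# insert a new version with an open-ended end date. Dicts stand in for
# dimension-table rows; all column names are hypothetical placeholders.
HIGH_DATE = "9999-12-31"   # open-ended end date for the current version

def apply_scd2(dim_rows, incoming, load_date):
    """Apply incoming records to the dimension, preserving full history."""
    current = {r["cust_id"]: r for r in dim_rows if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["cust_id"])
        if old is not None and old["address"] == rec["address"]:
            continue                        # unchanged: keep existing row
        if old is not None:
            old["end_date"] = load_date     # close out the old version
            old["is_current"] = False
        dim_rows.append({                   # insert the new current version
            "cust_id": rec["cust_id"], "address": rec["address"],
            "eff_date": load_date, "end_date": HIGH_DATE, "is_current": True,
        })
    return dim_rows
```

The same close-and-insert pattern is what the history-holding dimensions above implement, regardless of the tool used.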

Environment: Informatica, Sybase, DB Artisan, UNIX, CA Scheduler

Confidential

Solution Architect, Chicago, IL

Responsibilities:

  • Worked closely with the business partners and source system SMEs to finalize the data and ETL requirements
  • Designed a robust customer-centric data warehouse to provide a single view of customer information across the enterprise
  • Profiled the data from various source systems, then enriched and augmented it using various data cleansing and data quality tools and techniques
  • Standardized various address formats and validated addresses against the US postal database and other global address databases
  • Standardized various forms of names into a single format
  • Seamlessly integrated the application with the Informatica Data Quality tool to address data quality issues and standardize data formats
  • Matched common counterparties and legal entities using a third-party tool and assigned a unique legal entity ID to related counterparties and legal entities
  • Integrated the application with the third-party data service provider (D&B) to identify the hierarchy of the legal entity.
  • Instrumental in the end-to-end implementation of a complex MDM solution.
  • Led Business Objects reporting team to implement analytical functionalities like Drill Down, Slice & Dice, Drill Through
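
The name standardization and matching steps above reduce differing name formats to one canonical key so that related counterparty records can share a legal entity ID. A minimal sketch follows; the formats handled and the ID scheme are illustrative only, not the production IDQ rule set.

```python
# Sketch: normalize name variants ('DOE, JOHN', 'john  doe') to one
# canonical key, then assign a shared entity ID per key. The rules and
# starting ID are hypothetical placeholders for the real matching logic.

def standardize_name(raw):
    """Normalize 'Last, First' and spacing/case variants to 'first last'."""
    name = raw.strip().lower()
    if "," in name:                          # 'last, first' -> 'first last'
        last, first = [p.strip() for p in name.split(",", 1)]
        name = f"{first} {last}"
    return " ".join(name.split())            # squeeze repeated whitespace

def assign_entity_ids(records, start=1000):
    """Assign one shared ID per standardized name across all records."""
    ids, next_id, out = {}, start, {}
    for raw in records:
        key = standardize_name(raw)
        if key not in ids:                   # first time this entity is seen
            ids[key] = next_id
            next_id += 1
        out[raw] = ids[key]
    return out
```

In production this exact-key match was complemented by the third-party matching tool and D&B hierarchy data noted above.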

Environment: Informatica, Informatica Data Quality, Oracle 10g, Business Objects XI R2, Rapid SQL, UNIX, Control M Scheduler

Confidential

Lead Developer, Houston, TX

Responsibilities:

  • Monitored the service queue and responded to problem tickets in Remedy.
  • Analyzed various problem tickets and identified the root cause of each issue
  • Fixed various issues based on the analysis within the stipulated SLA time period
  • Attended the weekly status calls with the business to give a precise status on various ongoing activities
  • Prepared technical design documents for various small work items (SWI)
  • Involved in application effort estimation and managing a team of developers
  • Wrote PL/SQL queries to extensively test the data between source and target applications.
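
The source-vs-target testing above was done in PL/SQL (a MINUS-style comparison in both directions); the same check is sketched here in Python over rows fetched from the two systems. The row shapes are hypothetical placeholders.

```python
# Sketch: reconcile source rows against target rows in both directions,
# mirroring a SQL MINUS comparison. Row tuples below are hypothetical.

def reconcile(source_rows, target_rows):
    """Return (rows missing from target, rows unexpected in target)."""
    src, tgt = set(source_rows), set(target_rows)
    return sorted(src - tgt), sorted(tgt - src)

missing, extra = reconcile(
    [(1, "open"), (2, "closed"), (3, "open")],
    [(1, "open"), (3, "open"), (4, "open")],
)
print(missing, extra)   # rows only in source vs. rows only in target
```

Both result sets being empty is the pass condition for a load test.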

Environment: Oracle 10g, SQL Server, Visual Basic, ASP, Crystal Reports 8.5

Confidential

Developer, Melville, NY

Responsibilities:

  • Involved in analysis of business requirements and preparation of high-level design documents (HLD).
  • Involved in designing the technical architecture & program specifications of the application
  • Developed web-based interfaces and wrappers for backend business objects using HTML, JavaScript, VBScript and ASP
  • Developed various Oracle stored procedures to handle the business logic
  • Designed various MIS reports, commission statement reports, month end reports and year-end reports using Crystal Reports 8.5
  • Involved in system performance and load testing.

Environment: Oracle 10g, ASP, VBScript, JavaScript, HTML, IIS 5.0, Crystal Reports 8.5
