
Data Warehouse Consultant/Greenplum/Postgres Developer Resume


Charlotte, NC

SUMMARY:

  • Over 7 years of IT experience in RDBMS and data warehousing, spanning the design, development, testing, implementation, and support of enterprise data warehouses.
  • Experienced in Requirement Analysis, Test Design, Test Preparation, Test Execution, Defect Management, and Management Reporting.
  • Experienced in developing Data Mapping, Performance Tuning and Identifying Bottlenecks of sources, mappings, targets and sessions.
  • Strong understanding of data modeling in data warehouse environments, including star and snowflake schemas.
  • Hands-on experience developing PL/SQL, stored procedures, and UNIX shell scripts.
  • Experienced in BI Reporting tools such as Business Objects, COGNOS and Oracle Discoverer.
  • Experienced in development of ETL mapping and scripts.
  • Experienced with Batch systems scheduling and processing.
  • Strong understanding of data quality assurance processes and procedures.
  • Experienced in software development life cycle (SDLC).
  • Experienced working with offshore vendors and establishing offshore teams and processes.
  • Skilled at learning new concepts quickly, working well under pressure, and communicating ideas clearly and effectively.
  • Excellent team player, able to perform individually, with good interpersonal and analytical skills.
  • Facilitated JAD sessions and interviews.

TECHNICAL SKILLS:

Technologies: Microsoft .NET, Web Services (SOAP, WSDL, UDDI)

Languages: Visual C#, Visual Basic

Scripting: JavaScript, HTML, DHTML, CSS, XML, XSL, XSLT

App Server: Microsoft IIS 7.0/IIS 6.0/IIS 5.0, Microsoft Windows Server 2003

Databases: MS SQL Server 2008/2005/2000, MS Access, Oracle 11g/10g/9i

Tools: Informatica, Teradata, Passport, Visual Studio 2008/2005, SharePoint 2007, iRise, Telelogic DOORS, Rational Suite (RequisitePro, ClearQuest, ClearCase), Rational Rose, Axure Pro, Blueprint, CaliberRM, Enterprise Architect, Team Foundation Server, Business Objects XI, Cognos BI Studio

Applications: MS Office Suite 2007, MS Project, MS Visio, VMware, FileNet P8, Documentum 6.x, Ftp95 Pro

OS: Windows Vista x64/x86 Enterprise, Windows XP Professional x64/x86, Linux, Unix

PROFESSIONAL EXPERIENCE:

Confidential, Charlotte, NC

Data Warehouse Consultant/Greenplum/Postgres Developer

Responsibilities:

  • Developed T-SQL stored procedures, triggers, and user-defined data types and functions to enforce business rules.
  • Created PowerShell scripts to update JBoss server components and database objects.
  • Subject matter expert and administrator for the Tidal Enterprise Scheduler; managed the application and patching, and created guidelines and best practices for building jobs.
  • Derived the detailed business, technical, security, data, and metadata architecture design for the EDW (up to a terabyte) with multiple data marts ranging from 200-400 GB each.
  • Scheduled SSIS package jobs to ETL daily transactions from the company's ERP OLTP database to the OLAP data warehouse.
  • Created Triggers to maintain log history of all tables and major changes to the existing production databases.
  • Customized the newly implemented Netezza framework (NZDIF) for the project's needs.
  • Developed ELT scripts to load data warehouse on Netezza 6.0.
  • Worked extensively with the Netezza framework on the Linux platform.
  • Implemented CDC for dimension tables by leveraging the Netezza framework.
  • Developed complex nzsql & nzload scripts.
  • Performed data extraction, transformation, and loading using BTEQ, FastLoad, and MultiLoad from Oracle to Teradata.
  • Automated the printer creation process across four servers, with custom tray settings, using a PowerShell script.
  • Used the Teradata FastLoad/MultiLoad utilities to load data into tables.
  • Used Teradata SQL Assistant to build the SQL queries.
  • Used analytical and windowing functions in Netezza to implement complex business logic.
  • Worked on overall performance improvement of data loads by leveraging Netezza MPP architecture & following Netezza best practices.
  • Worked with the BI team to understand reporting requirements and created materialized views to improve report performance.
  • Worked with Teradata ETL utilities such as FastLoad, MultiLoad, TPump, and FastExport.
  • Used the Teradata Priority Scheduler to control system load.
  • Converted hundreds of Teradata macros to Greenplum/Postgres functions using SQL and PL/pgSQL.
  • Involved in relational data modeling (normalization, relations) and multidimensional modeling using denormalization, fact tables, dimension tables, and star schemas for OLTP/OLAP systems.
  • Involved in Database design process and Data modeling process using Microsoft Visio 2007.
  • Converted Teradata tables, views, functions, and stored procedures to Greenplum/Postgres equivalents using SQL and PL/pgSQL.
  • Implemented a process to establish Referential Integrity between related dimension/fact tables.
  • Used FastExport to extract data from Teradata into the landing area on Netezza.
  • Used FastLoad, MultiLoad, and BTEQ import to load data from Netezza back to Teradata as part of the data back-feed.
  • Used named pipes on Linux to load data between databases, Teradata to Netezza and vice versa.
  • Implemented point to point controls at every data hop to ensure data quality.
  • Worked on setting up DR (Disaster Recovery) environment and tested DR scenario.
  • Implemented Data validation at summary level and detailed level between existing data model on Teradata and the new data model on Netezza.
  • Worked with the team on dimensional modeling of the data warehouse, designing business processes, dimensions, and measured facts in Erwin R9.
  • Worked on incremental replication of data mart from load box to distribution box for MicroStrategy reports & for downstream applications.
  • Implemented version control of code base and migration of code using Perforce & SVN.
  • Created Autosys jobs to schedule the data flows at each layer of the data warehouse.
  • Worked with offshore teams regularly at every stage of the project.
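The named-pipe load pattern above, where a Teradata FastExport (fexp) producer streams directly into a Netezza nzload consumer without an intermediate flat file, can be sketched as follows. This is a minimal, hypothetical illustration: an in-process writer and a caller-supplied reader stand in for fexp and nzload so the sketch is runnable, and all names are illustrative.

```python
import os
import tempfile
import threading

def stream_rows(rows, consume):
    """Stream rows through a named pipe so no flat file lands on disk.

    In production the producer would be fexp writing its export to the
    pipe and the consumer would be nzload reading the pipe as its data
    file; here plain Python stands in for both.
    """
    pipe = os.path.join(tempfile.mkdtemp(), "td_to_nz.pipe")
    os.mkfifo(pipe)

    def producer():
        # Stand-in for: fexp < export_script.fx  (writing to the pipe)
        with open(pipe, "w") as w:
            for row in rows:
                w.write(row + "\n")

    t = threading.Thread(target=producer)
    t.start()
    # Stand-in for: nzload ... -df $pipe  (reading the pipe)
    with open(pipe) as r:
        result = consume(r)
    t.join()
    os.unlink(pipe)
    return result

loaded = stream_rows(["1|alice", "2|bob", "3|carol"],
                     lambda f: sum(1 for _ in f))
print(loaded)  # 3
```

The advantage over staging to disk is that extract and load overlap in time and no landing-zone space is consumed, which matters when back-feeding multi-hundred-GB marts.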

Environment: Agile Product Lifecycle Management (PLM) for Process 6.0, SSIS, SSRS, SSAS, Tidal Scheduler 5.3, Pentaho, Oracle 11g, PowerShell, Oracle Business Intelligence Enterprise Edition (OBIEE) 10.1.3.x, OBIEE 11.1.1.6.9, Oracle Product Lifecycle Analytics (OPLA) 3.0, Greenplum, Postgres

Confidential, Moosic, PA

Data Analyst

Responsibilities:

  • Responsible for defining the project's scope and business rules, gathering business requirements, and documenting them textually or with models; interacted with cross-functional teams to facilitate requirements gathering.
  • Worked closely with a project team for gathering business requirements and interacted with business users to translate business requirements into technical specifications.
  • Analyzed the functional specs provided by the data architect and created technical specs documents for all the mappings.
  • Created data flow diagrams, data mapping from Source to stage and Stage to Target mapping documents indicating the source tables, columns, data types, transformations required and business rules to be applied.
  • Created test cases and test scripts for System Integration Testing (SIT), User Acceptance Testing (UAT), and unit integration testing.
  • Designed Informatica mappings such as pass-through, split, Type 1, and Type 2, and used transformations such as Lookup, Aggregator, Rank, Mapplets, connected and unconnected lookups, and SQL overrides.
  • Implemented CDC (change data capture) for inserting and updating slowly changing dimension tables in target for maintaining the history data.
  • Validated ETL mappings and tuned them for better performance and implemented various Performance and Tuning techniques.
  • Validated job flows and dependencies used by the Tidal scheduler to run Informatica workflows, shell scripts, and stored procedures.
  • Extracted large volumes of data feed from different data sources, performed transformations and loaded the data into various Targets.
  • Performed Unit Testing at various stages by checking the data manually.
  • Performed Data Analysis and Data validation by writing SQL queries.
  • Participated in the Incident Management and Problem Management processes for root cause analysis, resolution and reporting.
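The CDC work above, inserting and updating slowly changing dimension rows so that history is preserved, follows the standard Type 2 pattern: expire the current version and insert a new one. A minimal sketch, with an illustrative row shape (not the project's actual schema) and in-memory dicts standing in for dimension-table rows:

```python
from datetime import date

def apply_scd2(dim, changes, load_date):
    """Apply change-data-capture deltas to a Type 2 dimension.

    dim rows are dicts: natural key `nk`, tracked attribute `attr`,
    `eff_from`/`eff_to` validity dates, and a `current` flag.
    """
    for nk, attr in changes.items():
        current = next((r for r in dim if r["nk"] == nk and r["current"]), None)
        if current is not None and current["attr"] == attr:
            continue  # attribute unchanged: nothing to do
        if current is not None:
            # Expire the old version instead of overwriting it (Type 2).
            current["current"] = False
            current["eff_to"] = load_date
        # Insert the new version as the current row, preserving history.
        dim.append({"nk": nk, "attr": attr, "eff_from": load_date,
                    "eff_to": None, "current": True})
    return dim

dim = [{"nk": 101, "attr": "Charlotte", "eff_from": date(2012, 1, 1),
        "eff_to": None, "current": True}]
apply_scd2(dim, {101: "Moosic"}, date(2013, 6, 1))
print(len(dim), sum(r["current"] for r in dim))  # 2 1
```

A Type 1 mapping would instead overwrite `attr` in place, which is why the two mapping styles are listed separately above.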

Environment: Rational Unified Process (RUP), Rational Suite (RequisitePro, ClearQuest, ClearCase), MS Office Suite, MS Visio, MS Project, Oracle 9i, Mercury Test Director, MS SQL Server 2005, Rational Rose, MS Access, MS SharePoint, Business Objects

Confidential, San Francisco,CA

Data Warehousing Analyst

Responsibilities:

  • Analyzed end-user requirements and business rules from the given documentation, and worked closely with tech leads and business analysts to understand the current system.
  • Assisted the business analyst in documenting business requirements, technical specifications, and the implementation of various ETL standards in the mappings.
  • Analyzed the business requirements and involved in writing Test Plans and Test Cases.
  • Involved in the design of the data warehouse using star-schema methodology and converted data from various sources into Oracle tables.
  • Developed views, functions, procedures, and packages using PL/SQL and SQL to transform data from the source staging area to the target staging area.
  • Wrote SQL queries to perform Data Validation and Data Integrity testing.
  • Created SQL*Loader scripts to load legacy data into Oracle staging tables.
  • Created staging tables necessary to store validated customer data prior to loading data into customer interface tables.
  • Coordinated with Business Objects developers in building universes and developed reports such as cross-tab and master-detail, with line, column, area, and pie charts for analysis.
  • Developed UNIX shell scripts to run the batch jobs.
  • Documented the procedures and flow for the process.
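The data validation and integrity testing above boils down to reconciling source and target extracts: compare row counts at summary level, then find keys that dropped out or appeared during the load. A minimal sketch; the row shape and key name are illustrative, with lists of dicts standing in for SQL query results:

```python
def reconcile(source_rows, target_rows, key):
    """Summary- and detail-level validation between two extracts.

    source_rows/target_rows are lists of dicts (e.g. rows fetched from
    the legacy source and the warehouse target); `key` names the join
    column. Returns counts plus keys present on only one side.
    """
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "source_count": len(source_rows),      # summary-level check
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),  # detail-level
        "extra_in_target": sorted(tgt_keys - src_keys),
    }

src = [{"id": 1}, {"id": 2}, {"id": 3}]
tgt = [{"id": 1}, {"id": 3}, {"id": 4}]
print(reconcile(src, tgt, "id")["missing_in_target"])  # [2]
```

In practice the same comparison is usually expressed as a pair of SQL queries (counts, then an anti-join on the key) run against both databases; the Python version just makes the control explicit.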

Environment: Rational Unified Process (RUP), Rational Suite (RequisitePro, ClearQuest, ClearCase), MS Visio, MS Project, Oracle 9i, Mercury Test Director, MS SQL Server 2005, Rational Rose, iRise, TOAD, Cognos 8 BI Studio

Confidential, Los Angeles, CA

Data Warehousing Analyst

Responsibilities:

  • Provided 24x7 support for production operations (incident break/fix, change, service request, project, and databases)
  • Handled tickets using incident management through the Peregrine Service Center.
  • Ensured all production changes complied with change management policies and procedures.
  • Developed troubleshooting documents listing all support-related activities and day-to-day issues encountered.
  • Coordinated with all concerned parties (database, project, and application teams) to troubleshoot issues, and worked with the P1 team on major business-critical production issues.
  • Monitored and scheduled highly sensitive data loads (e.g., Electronic Wire Funding, HELOC Direct Check, FDR Direct Check, Lockbox) to ensure all steps executed successfully.
  • Responsible for scheduling, running, and monitoring Autosys batches.
  • Designed and developed several mappings to load data from Source systems to ODS and then to Data Mart.
  • Involved in fine-tuning Informatica mappings, stored procedures, and SQL to obtain optimal performance and throughput.
  • Created and maintained several custom reports using Business Objects.
  • Ensured that all production changes were fully documented, supportable, and captured in the Enterprise Configuration Management System (ECMS).
  • Maintained and updated the system data flow chart, Visio documents, and system documentation.
  • Provided on-call and pager support during off hours and weekends.

Environment: Rational Unified Process (RUP), Rational Suite (RequisitePro, ClearQuest, ClearCase), MS Visio, MS Project, Windows XP, Oracle 9i, Mercury Test Director, MS SQL Server 2005, Rational Rose, MS Access, iRise, Business Objects, Cognos BI Studio
