Data Warehouse Tech Lead Resume

Burlington, MA

SUMMARY

  • Data Warehouse Tech Lead with 8+ years of relevant experience in ETL/ELT (Talend/Informatica), Oracle Database, shell scripting (Python and Unix), and reporting tools (Cognos, Crystal, Jasper). Domain knowledge in investment, insurance, and retail marketing.
  • Lead the design, implementation, and maintenance of data movement processes, including system processes, process architecture, extract, transform, and load (ETL) mechanisms (Talend and Informatica), data integration, data migration, and data load strategy.
  • Development of high-performance queries accessing tables with 500M+ rows (see the illustrative query sketch at the end of this summary)
  • Analyze data storage structures including staging area design, operational data stores (ODS), data warehouse design, and data mart design
  • Determine what is required to satisfy data quality requirements, including any specialized processes to cleanse data and report data quality issues.
  • Exhibit an understanding of business goals; recognize client needs and link them to specific data and reporting solutions
  • Demonstrated excellence with development best practices & QA
  • Evaluate and document all data sources and integration touch-points; model and implement dimensional data structures.
  • Participate in the design and development of comprehensive ETL solutions
  • Develop, deliver and support enterprise Data Warehouse, Data Marts, and cubes
  • Highly proficient with SQL and T-SQL
  • Oracle GoldenGate for migrating data between different data marts.
  • Data integration experience involving complex, enterprise-class data sources
  • Very strong data modeling, SQL programming and ETL skills (Talend/Informatica including MDM and Integration).
  • Solid understanding of Data Warehouse/BI security, performance tuning, and optimization.
  • Experience in BI reporting tools such as Cognos and Crystal; data modeling tools such as Erwin Data Modeler, Toad, and SQL Developer.
  • Experience in UNIX shell scripting, Python programming, Perl, and Java.
  • Experience in the cron scheduler, Control-M, Autosys, and Tivoli.
  • Experience in a migration project from SQL Server to Oracle.
  • Experience in creating extract files and transferring files via FTP to different servers and mailboxes.
  • Active team participant in risk management, planning, and scoping of projects
  • Accountable for code and documentation delivery of commitments throughout the iteration
  • Participate in code reviews and documentation reviews
  • Be open to learning new technologies and new problem domains
  • Able to react to change productively and handle other essential tasks as assigned
  • Assist in the onboarding and development of junior team members
  • Participate in the design and implementation of a resilient, end-to-end BI solution framework to ensure data quality, complete/accurate processing, error-handling, and logging for monitoring and auditing
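The bullet above on high-performance queries against 500M+ row tables is illustrated by the sketch below. This is an illustrative example only, assuming a hypothetical sales_fact table range-partitioned by sale_date; the date filter enables partition pruning, the PARALLEL hint spreads the scan, and the analytic RANK() avoids a self-join when picking top customers per region.

    -- Illustrative only: hypothetical sales_fact table, range-partitioned by sale_date.
    SELECT region_id, customer_id, total_sales
    FROM (
        SELECT /*+ PARALLEL(f 8) */
               f.region_id,
               f.customer_id,
               SUM(f.sale_amount) AS total_sales,
               RANK() OVER (PARTITION BY f.region_id
                            ORDER BY SUM(f.sale_amount) DESC) AS sales_rank
        FROM   sales_fact f
        WHERE  f.sale_date >= DATE '2015-01-01'   -- prunes to the relevant partitions
        AND    f.sale_date <  DATE '2016-01-01'
        GROUP BY f.region_id, f.customer_id
    )
    WHERE sales_rank <= 10;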

TECHNICAL SKILLS

Languages: SQL, PL/SQL, C# 3.0, Unix Shell Scripting, Python

Oracle Tools: SQL Developer 4.1, TOAD 10.6, SQL*Plus, Oracle GoldenGate

ETL Tools: Talend Studio 5.3, Informatica 9.x, SQL*Loader

Databases: Oracle 10g/11g.

Operating Systems: Windows 7/XP/NT/Vista, UNIX, Linux

Data Modeling Tools: Toad Data Modeler 3.6, Erwin 7.x, MS Visio 2013, MS PowerPoint

Server: Unix, Informatica, Unica

Other Tools: TAC, FileZilla, Stash, ClearCase, Autosys, Cognos Reporting, WinSCP, Toad 10.6, Toad Data Modeler 3.6, Jira, TFS, VSS, PuTTY, IIS, MS Excel 2007/2010/2013, CodeGen, Tivoli, JRS, HPQC, EQUIP, PRISM

PROFESSIONAL EXPERIENCE

Confidential, Burlington, MA

Data Warehouse Tech Lead

Responsibilities:

  • Lead the design, implementation, and maintenance of data movement processes, including system processes, process architecture, extract, transform, and load (ETL) mechanisms (Talend), data integration, data migration, and data load strategy.
  • Active participant of a team including risk management, planning and scoping of projects.
  • Evaluate and document all data sources and integration touch-points, model and implement dimensional data structures, participate in the design and development of comprehensive ETL solutions.
  • Development of high-performance queries accessing tables with 500M+ rows.
  • Analyze the performance of batch jobs, and suggest and implement process improvements.
  • Expert in creating complex SQL queries, subqueries, PL/SQL packages, functions, stored procedures, and analytical and aggregate functions.
  • Use materialized views, index-organized tables, indexes, partitions, and optimizer hints for performance tuning.
  • Schedule daily, weekly, and monthly jobs via cron.
  • Use Talend components for data cleansing, data validation, data mapping, and loading of various files into staging database objects.
  • Generate, deploy, and publish Talend jobs on TAC (Talend Administration Center).
  • Monitor jobs with Oracle Enterprise Manager 12c.
  • Create various file extracts and use WinSCP and FileZilla for file transfer between servers.
  • Use SQL*Loader for direct-path and parallel loads of data from raw files into database tables.
  • Use Virtual Private Database (VPD) to implement an additional layer of data security.
  • Use the Cognos reporting tool to generate reports on data quality, best sales, number of events, and data validation.
  • Write Unix shell scripts and Python programs for file validation, manipulation, and conversion.
  • Use SQL*Loader to load files from the Unix server into Oracle database objects.
  • Perform matching and merging on incoming files through Talend MDM based on various matching rules.
  • Perform complex aggregation logic to identify customer-related information such as best address, best phone, and best email (see the sketch after this list).
  • Use Erwin Data Modeler to design logical and physical data models (LDM/PDM) with constraints and keys.
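A minimal sketch of the "best record" aggregation logic mentioned above, assuming hypothetical stg_customer_contact and dim_customer_best_address tables (the real matching rules were project-specific). ROW_NUMBER() keeps exactly one validated, most recently updated address per customer.

    -- Illustrative sketch: pick one "best" address per customer from a staging table.
    INSERT INTO dim_customer_best_address (customer_id, address_line, city, state, zip)
    SELECT customer_id, address_line, city, state, zip
    FROM (
        SELECT c.customer_id,
               c.address_line, c.city, c.state, c.zip,
               ROW_NUMBER() OVER (
                   PARTITION BY c.customer_id
                   ORDER BY c.is_validated DESC, c.last_updated DESC
               ) AS rn
        FROM   stg_customer_contact c
    )
    WHERE rn = 1;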

Environment: Oracle Database 10g/11g, Talend 5.3, SQL Developer 4.x, Toad 11.x, Toad Data Modeler 3.6, Autosys, FileZilla, PuTTY, Jira, Unix server, MS Visio, Erwin, WinSCP, Oracle Enterprise Manager 12c

Confidential, Merrimack, NH

Database Engineer

Responsibilities:

  • Worked on data contribution by creating database objects such as tables, views, procedures, cursors, triggers, types, sequences, packages, and functions using Toad and SQL Developer.
  • Carried out data adoption by tuning complex SQL queries using Explain Plan, indexes, collections, and bulk operations; used partitions and Oracle hints for optimal query performance.
  • Created conceptual, logical, and physical models and reviewed them with data analysts and DBAs to ensure adherence to standards.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
  • Worked with PowerCenter tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Implemented SCD Type 1 and Type 2 mappings to capture new changes and maintain historical data (see the sketch after this list).
  • Worked on matching, merging, and consolidation of data through Informatica MDM using the MDM Hub.
  • Scheduled data integration jobs using Autosys and Control-M.
  • Coordinated with UI developers to call APIs through their services.
  • Worked with utPLSQL for unit testing.
  • Worked on migration of data from CDE to ODS to the warehouse using Informatica workflows and mappings.
  • Used JIL files and applyIM to install changes in the development and production environments.
  • Used Oracle GoldenGate to migrate data from MDM to CDE.
  • Used Stash for version control, merging changes from the development branch into master.
  • Adopted Agile methodologies and participated in daily scrum meetings; worked on sprints created by the Scrum Master in Jira.
  • Used PuTTY to log in to the Informatica and Unix servers.
  • Used a code generator to produce packages of similar structure.
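The SCD Type 2 loads above were built as Informatica mappings; as an illustration only, the same pattern can be sketched in plain SQL against hypothetical stg_customer and dim_customer tables, where effective-date columns and a current-row flag track history.

    -- Step 1: expire the current dimension row when a tracked attribute changed.
    UPDATE dim_customer d
    SET    d.eff_end_dt   = TRUNC(SYSDATE) - 1,
           d.current_flag = 'N'
    WHERE  d.current_flag = 'Y'
    AND    EXISTS (SELECT 1
                   FROM   stg_customer s
                   WHERE  s.customer_id = d.customer_id
                   AND   (s.address <> d.address OR s.phone <> d.phone));

    -- Step 2: insert a new current version for new or changed customers.
    INSERT INTO dim_customer
           (customer_id, address, phone, eff_start_dt, eff_end_dt, current_flag)
    SELECT s.customer_id, s.address, s.phone, TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
    FROM   stg_customer s
    WHERE  NOT EXISTS (SELECT 1
                       FROM   dim_customer d
                       WHERE  d.customer_id  = s.customer_id
                       AND    d.current_flag = 'Y');
    COMMIT;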

Environment: Oracle Database 10g/11g, Informatica 9.1.0, SQL Developer 4.x, Toad 10.6, Toad Data Modeler 3.6, Autosys, Stash, Jira, Unix server, MS Visio, Informatica server, Oracle GoldenGate.

Confidential, Northbrook, IL

Oracle PL/SQL Developer

Responsibilities:

  • Worked with business stakeholders, application developers, production teams, and other functional units to identify business needs and discuss solution options.
  • Created database objects such as tables, views, procedures, and packages using PL/SQL and SQL*Plus, and handled exceptions (see the sketch after this list).
  • Excellent working knowledge of Oracle database architecture (column orientation, projections, pivoting, segmentation, partitions, high availability and recovery, security).
  • Implementation experience in relational and dimensional modeling, star/snowflake schemas.
  • Extensively used Unix shell scripting to create scripts that call Oracle stored procedures and functions.
  • Created complex mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Update Strategy, Union, Lookup, Joiner, and Stored Procedure transformations.
  • Worked with PowerCenter tools: Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
  • Worked on extract-and-load mappings.
  • Worked with the command-line program pmcmd to interact with the Informatica server: starting and stopping sessions and batches, stopping the server, and recovering sessions.
  • Developed Unix shell scripts to schedule Informatica sessions.
  • Used the Tivoli job scheduler for scheduling and monitoring jobs and their successors; set up jobs in the production and UAT environments.
  • Resolved abended jobs using application log files.
  • Performed performance tuning, troubleshooting, and bug fixing of PL/SQL scripts while working closely with UI developers.
  • Maintained quality-related documents such as the project plan, PSSP, DR-BCP, and RAT register.
  • Extensively used Erwin 7.x for data modeling and dimensional data modeling.
  • Provided production support for the backend database and reporting applications.
  • Used HP Quality Center (HPQC) for logging defects and executing test cases.
  • Carried out extensive data validation with the client's data specialists, validating complex SQL queries.
  • Held discussions with business groups about data quality results and next steps to develop business rules for efficient data quality checks and validation.
  • Designed documentation including functional specifications, user manuals, and technical reviews.
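An illustrative sketch of the PL/SQL exception-handling style referenced above, using hypothetical policy_stage, policy, and etl_error_log tables rather than actual project code.

    CREATE OR REPLACE PROCEDURE load_policy (p_batch_id IN NUMBER) AS
    BEGIN
        -- Move one batch from staging into the target table.
        INSERT INTO policy (policy_id, holder_name, premium)
        SELECT s.policy_id, s.holder_name, s.premium
        FROM   policy_stage s
        WHERE  s.batch_id = p_batch_id;
        COMMIT;
    EXCEPTION
        WHEN DUP_VAL_ON_INDEX THEN
            ROLLBACK;
            INSERT INTO etl_error_log (batch_id, err_code, err_msg, logged_at)
            VALUES (p_batch_id, SQLCODE, 'Duplicate policy in batch', SYSDATE);
            COMMIT;
        WHEN OTHERS THEN
            ROLLBACK;
            INSERT INTO etl_error_log (batch_id, err_code, err_msg, logged_at)
            VALUES (p_batch_id, SQLCODE, SUBSTR(SQLERRM, 1, 4000), SYSDATE);
            COMMIT;
            RAISE;
    END load_policy;
    /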

Environment: Oracle Database 10g/11g, PL/SQL, SQL, Informatica 8.x, Erwin 7.x, Telnet, Unix shell scripting, Toad, Team Foundation Server, Tivoli, JRS, HPQC.

Confidential

Oracle PL/SQL Developer

Responsibilities:

  • Analyzed internal and external source systems and business requirements to develop the health insurance claims data warehouse logical and physical models (facts and dimensions) and the business process model.
  • Applied data modeling techniques using ERDs and data flow diagrams.
  • Developed back-end interfaces using tables, PL/SQL packages, stored procedures, functions, collections, and object types specifically for financial reporting (see the sketch after this list).
  • Tuned SQL queries for better performance and troubleshot development problems using hints, indexes, partitioning, and analysis of statistics updates.
  • Implemented business rules using PL/SQL procedures/packages with complex waterfall calculations.
  • Wrote Unix shell scripts to support nightly batch schedules.
  • Performed data migration and production support for the backend database and reporting applications.
  • Imported source tables from different databases.
  • Performed performance testing and user acceptance testing of the application and participated in code and design reviews, status meetings, and walkthroughs.
  • Involved in continuous enhancement, optimization, and problem fixing.
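A minimal sketch of the kind of back-end reporting interface described above, assuming hypothetical claim_fact and claim_month_summary tables; BULK COLLECT with a LIMIT plus FORALL keeps memory bounded on large claim volumes.

    CREATE OR REPLACE PACKAGE claim_report_pkg AS
        PROCEDURE summarize_month (p_month IN DATE);
    END claim_report_pkg;
    /
    CREATE OR REPLACE PACKAGE BODY claim_report_pkg AS
        PROCEDURE summarize_month (p_month IN DATE) IS
            TYPE t_num_tab IS TABLE OF NUMBER;
            l_claim_ids t_num_tab;
            l_amounts   t_num_tab;
            CURSOR c_claims IS
                SELECT claim_id, claim_amount
                FROM   claim_fact
                WHERE  claim_date >= TRUNC(p_month, 'MM')
                AND    claim_date <  ADD_MONTHS(TRUNC(p_month, 'MM'), 1);
        BEGIN
            OPEN c_claims;
            LOOP
                FETCH c_claims BULK COLLECT INTO l_claim_ids, l_amounts LIMIT 1000;
                EXIT WHEN l_claim_ids.COUNT = 0;
                FORALL i IN 1 .. l_claim_ids.COUNT   -- one insert round trip per 1000 rows
                    INSERT INTO claim_month_summary (claim_id, claim_amount, report_month)
                    VALUES (l_claim_ids(i), l_amounts(i), TRUNC(p_month, 'MM'));
            END LOOP;
            CLOSE c_claims;
            COMMIT;
        END summarize_month;
    END claim_report_pkg;
    /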

Environment: Oracle 9i/10g/11g, PL/SQL, SQL, Oracle Reports 9i, Windows NT/2000, Linux, MS Visio, UNIX shell scripting, Visual SourceSafe.

Confidential

Application Programmer

Responsibilities:

  • Worked with business analysts and clients to analyze and assess business specifications and to design and develop the application.
  • Developed new and modified existing PL/SQL packages, stored procedures, functions, and database triggers per business requirements.
  • Used TOAD and PL/SQL Developer for faster application design and development.
  • Involved in performance tuning, troubleshooting, and bug fixing of PL/SQL scripts while working closely with the UI developers.
  • Responsible for translating business requirements into technical specifications and documenting them to ensure compliance during software development.
  • Involved in writing complex scripts for data transformation and ETL (extract, transform, load) processes.
  • Uploaded and downloaded flat files from the UNIX server using FTP.
  • Applied data modeling techniques using ERDs and data flow diagrams.
  • Worked on performance tuning using Explain Plan and hints, and partitioned tables using the range method.
  • Involved in production support for the backend database and reporting applications.
  • Involved in writing PL/SQL scripts that generate auditing triggers automatically (see the sketch below).
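An illustrative sketch of generating an audit trigger from PL/SQL, assuming a hypothetical audit_log table; a production generator would typically also capture column-level old/new values.

    CREATE OR REPLACE PROCEDURE gen_audit_trigger (p_table_name IN VARCHAR2) AS
        l_sql VARCHAR2(4000);
    BEGIN
        -- Build the trigger source as text, then compile it with dynamic SQL.
        l_sql :=
            'CREATE OR REPLACE TRIGGER trg_aud_' || SUBSTR(p_table_name, 1, 22) || CHR(10) ||
            'AFTER INSERT OR UPDATE OR DELETE ON ' || p_table_name || CHR(10) ||
            'FOR EACH ROW' || CHR(10) ||
            'BEGIN' || CHR(10) ||
            '  INSERT INTO audit_log (table_name, action, changed_by, changed_at)' || CHR(10) ||
            '  VALUES (''' || p_table_name || ''',' || CHR(10) ||
            '          CASE WHEN INSERTING THEN ''I'' WHEN UPDATING THEN ''U'' ELSE ''D'' END,' || CHR(10) ||
            '          USER, SYSDATE);' || CHR(10) ||
            'END;';
        EXECUTE IMMEDIATE l_sql;
    END gen_audit_trigger;
    /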

Environment: Oracle 10g, SQL, PL/SQL, SQL Server, Unix Shell Scripting, Data Modeling using Visio, Microsoft Excel.
