
SQL/BI ETL Developer Resume


Alpharetta, GA

SUMMARY

  • ETL/BI Developer with 8+ years of overall professional experience, with strong analytical skills, intellectual curiosity, and the ability to uncover insights hidden within large sets of structured, semi-structured, and unstructured data.
  • 6.5+ years of experience in Data Warehousing, ETL, and Business Intelligence using the Pentaho BI Suite (Pentaho Data Integration/Kettle, Pentaho BI Server) and SSIS.
  • Experience in designing, performance tuning, analyzing, and implementing processes with the Pentaho Data Integration (PDI) ETL tool for data extraction, transformation, and loading. Designed end-to-end ETL processes to support reporting requirements, including aggregates, summary tables, and materialized views.
  • Expertise in developing Data Warehouse architecture, ETL frameworks, and BI integration using Pentaho Reports and Pentaho Dashboards.
  • Extensive knowledge of logical/physical relational and dimensional data modeling, star schema, snowflake schema, and fact and dimension tables.
  • Extensively worked on transformations such as Merge Join, Filter, Modified JavaScript, Stream Lookup, Update, and SQL Script.
  • Skilled in High-Level Design of ETL DTS Package for integrating data from heterogeneous sources (Excel, CSV, Oracle, MySQL, flat file, Text Format Data).
  • Experience in designing Database Models using Microsoft Visio and creating class diagrams, activity diagrams, use case diagrams, sequence diagrams, and flow charts.
  • Experience in creating various chart reports in Pentaho, including Pie Charts, 3D Pie Charts, Line Charts, Bar Charts, Stacked Bar Charts, and Percentage Bar Charts.
  • Excellent working knowledge in BI Analytics, Data Warehousing Models, Oracle, SQL, and MySQL.
  • Strong experience in SQL Server with Business Intelligence in SQL Server Integration Services (SSIS).
  • Strong knowledge and experience in all phases of the Software Development Life Cycle (SDLC), including development, testing, migration, administration, security management, and production support.
  • Strong data analytical and debugging skills to ensure accuracy and data integrity.
  • Ability to adapt quickly to different project environments, work in teams and accomplish difficult tasks independently within a time frame. Quick learner and excellent team player, ability to meet tight deadlines and work under pressure.
  • Excellent analytical, coordination, interpersonal, leadership, organizational, and problem-solving skills, ability to adapt, learn new technologies, and get proficient in them very quickly.
  • Experience working in Agile Methodology and ability to manage change effectively.
  • Hands-on experience managing Error Handling, Performance Tuning, Error Logging, clustering, and High Availability in Pentaho.
  • Hands-on experience with all phases of the Software Development Life Cycle (SDLC) and exposure to Agile processes and methodologies.
  • Hands-on experience in generating source-to-target mapping, validation, transformations, and business rules.
  • Hands-on experience using multi-format files such as XML, CSV, and TXT, and loading data from these files into relational tables.
  • Expertise in enhancements/bug fixes, performance tuning, troubleshooting, impact analysis, Unit testing, Integration Testing, UAT, and research skills.
  • Good exposure in projects with the Onsite - Offshore model.
  • Performed tasks such as merging flat files, creating and deleting temporary files, and renaming files to reflect the generation date; worked on 50+ complex jobs in recent projects.
  • Flexible to work in an Agile SCRUM environment.
  • Team player, also able to work independently with minimal supervision; innovative and efficient, good at debugging, with a strong desire to keep pace with the latest technologies.

TECHNICAL SKILLS

ETL Tools: Pentaho Data Integration (Kettle) 8.1/9, SSIS/BODS.

DW Tools: ER/Studio, MS Visio

RDBMS: Oracle 10g/9i/8.x, Teradata, PostgreSQL, MySQL, MS SQL Server, MS Access

NoSQL: MongoDB

Languages: SQL, PL/SQL, C, C++, Java, VB, Shell Scripting, and XML

Big Data: Hadoop

Operating Systems: Microsoft Windows, MS-DOS, UNIX, and Linux

Reporting Tools: Power BI

PROFESSIONAL EXPERIENCE

Confidential, Alpharetta, GA

SQL/BI ETL Developer

Responsibilities:

  • Installed and configured Pentaho DI Suite 8.2 along with the Enterprise Repository in the Pentaho BI server.
  • Worked with the business analytics team to identify and gather the requirements.
  • Created various Pentaho Transformations and Jobs using PDI Spoon.
  • Worked on various transformations using API JSON/XML payloads.
  • Worked in multiple environments per Confidential policy (DEV/SIT/UAT/PROD).
  • Troubleshot BI tool problems and provided technical support as needed.
  • Responsible for developing, supporting, and maintaining ETL processes using Pentaho PDI.
  • Migrated Pentaho Transformations and Jobs between environments using the Pentaho Import/Export utility.
  • Used Mapping Parameters and Variables to implement object orientation technologies and facilitate the reusability of code.
  • Improved the performance of Pentaho ETL jobs and reports by identifying bottleneck operations.
  • Designed and developed UNIX shell scripts to handle pre- and post-session processes and to validate incoming files.
  • Developed SQL scripts and created Oracle objects such as tables, views, materialized views, indexes, sequences, and synonyms.
  • Performed Data cleaning by creating tables to eliminate the dirty data using SSIS.
  • Performed various kinds of testing like Unit testing, Regression testing, and system testing in Dev, QA environments before Deployment.
  • Developed Audit Strategy to validate the data between Source and Target System.
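
The audit strategy described above can be sketched as a simple source-to-target reconciliation: compare row counts and a column checksum on both sides and flag any mismatch. A minimal illustration using Python's sqlite3 with in-memory tables as stand-ins; the table and column names are hypothetical, not taken from the actual project:

```python
import sqlite3

def reconcile(conn, source_table, target_table, amount_col):
    """Compare row counts and a column checksum between source and target.

    Returns a dict of audit metrics; a mismatch flags the load for review.
    (Illustrative sketch -- table/column names are hypothetical.)
    """
    cur = conn.cursor()
    metrics = {}
    for label, table in (("source", source_table), ("target", target_table)):
        cur.execute(f"SELECT COUNT(*), COALESCE(SUM({amount_col}), 0) FROM {table}")
        count, checksum = cur.fetchone()
        metrics[label] = {"rows": count, "checksum": checksum}
    metrics["match"] = metrics["source"] == metrics["target"]
    return metrics

# Usage: stage two toy tables and reconcile them.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (order_id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5);
""")
result = reconcile(conn, "src_orders", "tgt_orders", "amount")
print(result["match"])  # True when counts and checksums agree
```

In practice the same check would run against the actual source and target databases after each load, with the metrics written to an audit table.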

ENVIRONMENT: Pentaho Data Integration (Spoon) 8.2/9, MongoDB, APIs, SQL, MySQL.

Confidential, Dallas, Texas

ETL/SQL Developer

Responsibilities:

  • Involved in Extraction, Transformation and Loading of data Using Pentaho and SSIS.
  • Installed and configured Pentaho Data Integration Server 8.1 hierarchically from development to production servers.
  • Created enterprise repository in Pentaho BI server to store Jobs and Reports.
  • Used various types of inputs and outputs in Pentaho Kettle like Database tables, MS Access, Excel files, CSV files, Text files.
  • Used Pentaho Data Integration Designer 8.1 to create all ETL transformations and jobs.
  • Saved Pentaho jobs in the repository and scheduled them to run on a daily basis.
  • Identified and analyzed data compatibility and data quality issues and worked to ensure data consistency and integrity.
  • Used Pentaho Report Designer to create various reports.
  • Implemented Slowly Changing Dimensions Type 1 and Type 2 in ETL jobs for certain dimensions.
  • Implemented data-level security by creating database tables to store usernames, user groups, and their allowed permissions, and joined those tables in the report queries so that each user could see only the data permitted for them.
  • Created several dashboards in Pentaho using Pentaho Dashboard Designer.
  • Designed the data warehouse, including star schema design, DW capacity planning, and performance tuning.
  • Ran MySQL Import/export to load data between DEV and TEST Servers.
  • Converted abstract specifications into executable Java code and performed unit and integration testing of different modules.
  • Created documentation for application servers to access DB servers, in order to maintain and comply with organizational security standards.
  • Created single value as well as multi-value drop-down and list type of parameters with cascading prompt in the reports.
  • Gained exposure to cross-platform migrations, notably Oracle to PostgreSQL.
  • Worked with different sources such as SQL Server and flat files.
  • Set up sessions to extract data through FTP from remote Oracle and SQL Server data source files and implemented Slowly Changing Dimensions (Type 1 and Type 2).
  • Worked on the project documentation and prepared the Source Target mapping specs with the business logic and involved in data modeling.
  • Verified logs to confirm all relevant jobs completed successfully and on time; involved in production support to resolve production issues.
  • Wrote SQL, PL/SQL, stored procedures, triggers, and cursors for implementing business rules.
  • Migrated the code and released documents from DEV to QA (UAT) and to Production.
  • Involved in Production support and served as Technical Lead for a Team of Three.
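
The Type 2 Slowly Changing Dimension handling mentioned above follows a standard pattern: when a tracked attribute changes, the current dimension row is expired and a new current row is inserted. A minimal sketch using Python's sqlite3; the customer dimension, its columns, and the dates are illustrative assumptions, not details from the actual project:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical customer dimension with SCD Type 2 tracking columns.
    CREATE TABLE dim_customer (
        customer_key INTEGER PRIMARY KEY AUTOINCREMENT,
        customer_id  INTEGER,
        city         TEXT,
        valid_from   TEXT,
        valid_to     TEXT,     -- NULL while the row is current
        is_current   INTEGER
    );
    INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current)
    VALUES (42, 'Dallas', '2020-01-01', NULL, 1);
""")

def apply_scd2(conn, customer_id, new_city, load_date):
    """Type 2 change: expire the current row, then insert a new current row."""
    cur = conn.cursor()
    cur.execute(
        "UPDATE dim_customer SET valid_to = ?, is_current = 0 "
        "WHERE customer_id = ? AND is_current = 1 AND city <> ?",
        (load_date, customer_id, new_city),
    )
    if cur.rowcount:  # only insert when the attribute actually changed
        cur.execute(
            "INSERT INTO dim_customer (customer_id, city, valid_from, valid_to, is_current) "
            "VALUES (?, ?, ?, NULL, 1)",
            (customer_id, new_city, load_date),
        )
    conn.commit()

apply_scd2(conn, 42, 'Plano', '2021-06-01')
rows = conn.execute(
    "SELECT city, is_current FROM dim_customer WHERE customer_id = 42 "
    "ORDER BY customer_key"
).fetchall()
print(rows)  # [('Dallas', 0), ('Plano', 1)]
```

In PDI this logic corresponds to the Dimension Lookup/Update step, which maintains the validity dates and surrogate keys automatically.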

ENVIRONMENT: Pentaho BI Server, Pentaho Data Integration (PDI/Kettle), Pentaho Mondrian OLAP Server, Pentaho Design Studio, Pentaho Report Designer, Pentaho Dashboard Designer, Pentaho Analyzer, Java, MySQL, Oracle 10g, Oracle SQL Developer, SQL Profiler, Windows Server 2008/XP.

Confidential, Southfield, MI

SQL/BI Developer

Responsibilities:

  • Used various types of inputs and outputs like Database Tables, Text files, MS Access, MS Excel, CSV files.
  • Developed SSIS packages to generate reports in Excel sheets for analysis.
  • Implemented Database Stored Procedures, Packages, and Triggers in PL/SQL, Functions and maintained Integrity Constraints.
  • Created SSIS package to get data from different sources, cleanse the data and merge into one single source.
  • Used SQL queries in the Pentaho Interface.
  • Implemented Orders and Points DW using star schema, Orders and Points Business domain using Pentaho Meta Data.
  • Implemented various transformations in Kettle Spoon Designer, including Database Lookup, Database Join, Calculator, Generate Rows, Mapping Transformation, Filter Rows, Dimension Lookup/Update, Add Sequence, Add Constants, Row Normalizer, and Row Denormalizer.
  • Created reports using Microsoft SQL Server 2016/2012 Reporting Services (SSRS), with proficiency in Report Designer, Report Builder, and MS SharePoint.
  • Developed Complex Stored Procedures, Views and Temporary Tables as per the requirement.
  • Created complex reports involving multiple groupings and multi-value parameters.
  • Created multiple reports with drop-down menu options and complex groupings within the report.
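
Multi-value report parameters of the kind described above are typically expanded into an IN (...) clause, one placeholder per selected value. A minimal sketch with Python's sqlite3; the sales schema and values are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Hypothetical sales table backing a grouped report.
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('East', 'A', 100), ('East', 'B', 50),
        ('West', 'A', 75),  ('South', 'C', 20);
""")

def regional_report(conn, regions):
    """Grouped report filtered by a multi-value 'region' parameter.

    One placeholder is generated per selected value, mimicking how a
    reporting tool expands a multi-select parameter into IN (...).
    """
    placeholders = ", ".join("?" for _ in regions)
    sql = (
        f"SELECT region, SUM(amount) FROM sales "
        f"WHERE region IN ({placeholders}) "
        f"GROUP BY region ORDER BY region"
    )
    return conn.execute(sql, regions).fetchall()

report = regional_report(conn, ["East", "West"])
print(report)  # [('East', 150.0), ('West', 75.0)]
```

Generating placeholders per value keeps the query parameterized, avoiding string concatenation of user-selected values into the SQL.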

ENVIRONMENT: Windows Server 2003, IIS 6.0, MS SQL Server 2016/2012/2008/2005 Enterprise Edition, Pentaho, SSIS, SSAS, SSRS, T-SQL, DTS, SQL Profiler, Power BI, Erwin, VB.NET, OLAP, OLTP, UNIX, Performance Point Server, Office 2007, MS Excel, C#.NET, ProClarity Analysis, MS SharePoint.

SQL/SAP BODS/Pentaho

Confidential

Responsibilities:

  • Extracted data from flat files and databases, moved the data into a staging area, transformed it, and loaded it into SQL Server as the target.
  • Modified incoming data using Platform and Data Integrator transforms.
  • Created batch jobs, workflows, dataflows, and mappings.
  • Created sequential and parallel workflows and data flows.
  • Experienced in designing and implementing different types of data flows, including Case, Merge, Validation, Map Operation, and Row Generation transforms, using Business Objects Data Services.
  • Used LOOKUP functions in BODS ETL to derive columns by looking up values in validity-type lookup tables.
  • Built and updated ETL jobs to satisfy requirements.
  • Participated in code reviews and technical design discussions.
  • Performed basic testing of ETLs to determine data integrity.
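
Deriving a column from a validity-dated lookup table, as described above, amounts to a join constrained by an effective date range. A generic SQL analogue (not the BODS lookup function itself), sketched with Python's sqlite3; the status codes and dates are illustrative assumptions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Validity-dated lookup table: each code maps to a description
    -- that is effective over a date range.
    CREATE TABLE status_lookup (code TEXT, descr TEXT, eff_from TEXT, eff_to TEXT);
    INSERT INTO status_lookup VALUES
        ('A', 'Active - old', '2019-01-01', '2019-12-31'),
        ('A', 'Active',       '2020-01-01', '2099-12-31');
    CREATE TABLE orders (order_id INTEGER, status_code TEXT, order_date TEXT);
    INSERT INTO orders VALUES (1, 'A', '2020-05-10'), (2, 'A', '2019-06-01');
""")

# Derive a description column by picking the lookup value that is valid
# on each order's date -- the SQL analogue of a validity-aware lookup.
derived = conn.execute("""
    SELECT o.order_id, l.descr
    FROM orders o
    LEFT JOIN status_lookup l
      ON l.code = o.status_code
     AND o.order_date BETWEEN l.eff_from AND l.eff_to
    ORDER BY o.order_id
""").fetchall()
print(derived)  # [(1, 'Active'), (2, 'Active - old')]
```

The LEFT JOIN preserves rows whose code has no valid lookup entry, which surfaces data quality gaps instead of silently dropping records.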
ENVIRONMENT: SAP BODS 4.2, SAP ECC, SQL Server.

Associate Software Engineer

Confidential

Responsibilities:

  • Monitoring daily Pre-Prod and Prod Servers.
  • Extensively worked with Central Repository.
  • Extensively implemented joins.
  • Applied Performance Tuning Techniques in ETL flows.
  • Analyzed source and target data in depth.
  • Implemented different layers and maintained data reconciliation at all layers.
  • Worked extensively with the Validation transform and implemented complex logic with the help of lookups.
  • Made heavy use of the lookup_ext function.
  • Used the Merge transform.
  • Moved projects from DEV to QA to PROD using the central repository and ATL files.
  • Worked extensively with the CMC and the Data Services Management Console to manage the SAP BODS application.
ENVIRONMENT: SAP BODS 4.2, SQL Server.
