
Audience Discovery (Pentaho Data Integration, MySQL, Netezza) Resume


Seattle, WA

SUMMARY

  • Expert in identifying business requirements and eliciting system requirements
  • Translated and simplified requirements based on discussions; managed requirements and communicated with clients from different perspectives of the system
  • Applied modeling techniques and methods for stakeholders; actively participated in data modeling and business modeling
  • Specialization in various Pentaho BI tools
  • Good knowledge of the System Analysis & Design, Development, Testing, and Maintenance phases of project development.
  • Strong knowledge of Pentaho BI tools, including C-Tools (CDE, CDF, CDA), Pentaho Data Integration 4.0/4.2, Pentaho BI Server 4.8, Pentaho Report Designer 3.6, Mondrian 3.0, OLAP cubes, dashboards, Pentaho Reporting, cube analysis, and MDX queries
  • Good experience with SQL and PL/SQL on IBM DB2, MySQL, and Oracle databases
  • Good knowledge of Java
  • Experience with development tools such as Eclipse 3.0+.
  • Working experience with application servers such as JBoss 5.0+.
  • Experienced in working with operating systems including Windows, Unix, and Red Hat 8.
  • Worked with various development methodologies, including SDLC (Waterfall), OOAD, Agile, and iterative software development.

PROFESSIONAL EXPERIENCE

Confidential, Seattle, WA

Audience Discovery (Pentaho Data Integration, MySQL, Netezza)

Responsibilities:

  • Responsible for gathering information from clients regarding business requirements and collaborated with business technicians to research existing business and system processes.
  • Interacted with business analysts and end clients to understand technical and functional requirements for creating new jobs.
  • Developed complex custom reports using Pentaho Report Designer, including cascading pick lists, drill-throughs, hyperlinks, and sub-report functionality.
  • Developed several Pentaho Reports, Dashboards, XActions and Analyzer Reports for the client.
  • Designed and deployed custom dashboards on Pentaho User Console.
  • Integrated Pentaho reports and dashboards with the client’s existing front-end application and web portals.
  • Used Pentaho Data Integration 4.0 to extract, transform, and load data from heterogeneous source systems such as Excel and flat files.
  • Used PDI transformations to cleanse data: de-duplication, derived values, and address parsing.
  • Created complex PDI mappings to load the data warehouse.
  • The mappings involved extensive use of transformations such as Dimension Lookup/Update, Database Lookup & Join, Generate Rows, Calculator, Row Normalizer & Denormalizer, JavaScript, Add Constant, and Add Sequence.
  • Extensively worked with enterprise Repositories, PDI Job Servers and Enterprise console.
  • Responsible for Debugging and testing of PDI Jobs.
  • Optimized data mappings to achieve faster loads.
  • Performed debugging and performance tuning of sources, targets and mappings.
  • Worked with Parameters/Variables in PDI jobs and transformations to achieve automation.
  • Created JavaScript steps and worked with conditional statements and while loops to implement complex logic.
  • Extensively used “Define Error Handling” to handle exceptions and wrote scripts to automate the job process.
  • Tuned SQL query performance by restructuring joins, creating required indexes to avoid table scans, and adding error handling to queries.
  • Loaded data into targets using flat files, XML, and database tables as sources.
  • Performed migration of data from Excel, Sybase, flat files, Oracle, and MS SQL Server.
  • Tested and validated PDI ETL jobs and monitored daily PDI ETL schedules.
  • Performed troubleshooting and provided resolutions to ETL issues.
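As an illustration of the row-level cleansing described above (de-duplication and derived values), here is a minimal Python sketch. The field names (`customer_id`, `first_name`, `last_name`, `full_name`) are hypothetical, and the actual work was done in PDI transformation and JavaScript steps rather than standalone code:

```python
# Illustrative sketch only -- the kind of cleansing logic the PDI
# transformations performed: drop duplicate business keys and derive
# a full_name column. All field names are made up for this example.

def cleanse(rows):
    """Deduplicate on customer_id and add a derived full_name field."""
    seen = set()
    out = []
    for row in rows:
        key = row["customer_id"]
        if key in seen:          # skip duplicate business keys
            continue
        seen.add(key)
        row = dict(row)          # avoid mutating the caller's data
        row["full_name"] = f'{row["first_name"].strip()} {row["last_name"].strip()}'
        out.append(row)
    return out

rows = [
    {"customer_id": 1, "first_name": " Ada ", "last_name": "Lovelace"},
    {"customer_id": 1, "first_name": "Ada",   "last_name": "Lovelace"},
    {"customer_id": 2, "first_name": "Alan",  "last_name": "Turing "},
]
print(cleanse(rows))  # two rows survive; whitespace is trimmed
```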

Confidential, Poughkeepsie, NY

ETL Developer

Responsibilities:

  • Participated in Business Analysis & requirement collection. Interacted with clients directly before writing requirement specification documents.
  • Worked very closely with Project Manager to understand the requirement of ETL and reporting solutions to be built.
  • Analyzed the source data coming from various heterogeneous data sources like SAP to be consumed in the ETL to load data into data warehouse.
  • Created mapping documents to define and document one-to-one mapping between source data attributes and entities in target database.
  • Used Pentaho Data Integration Designer 4.3 to create all ETL transformations and jobs.
  • Collected data from different sources and cleansed it into a standard format; analyzed data to discover patterns and generated reports.
  • Designed Pentaho Data Integration (Kettle) jobs to perform ETL activities and reported data using Tableau.
  • Designed and maintained SQL scripts for data analysis and extraction.
  • Wrote Python scripts to improve processing efficiency.
  • Expanded the scope of the project by performing ETL activities in an Apache Hadoop environment (Hue).
  • Integrated data from these ETL activities and provided detailed analysis.
  • Used different types of input and output steps for various data sources, including database tables, Access, text files, Excel, and CSV files.
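The cleansing-into-a-standard-format work above can be sketched as follows. This is an illustrative example only, with a made-up CSV sample and hypothetical column names, showing one common kind of standardization: normalizing mixed date formats from different sources into ISO 8601:

```python
# Hedged sketch of a standardization script (sample data and column
# names are invented for illustration).
import csv
import io
from datetime import datetime

RAW = """order_id,order_date
1001,03/15/2014
1002,2014-03-16
1003,16 Mar 2014
"""

# Date formats observed across the (hypothetical) source systems.
FORMATS = ("%m/%d/%Y", "%Y-%m-%d", "%d %b %Y")

def to_iso(value):
    """Try each known format and return the date as YYYY-MM-DD."""
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt).strftime("%Y-%m-%d")
        except ValueError:
            continue
    raise ValueError(f"unrecognized date: {value!r}")

rows = list(csv.DictReader(io.StringIO(RAW)))
for row in rows:
    row["order_date"] = to_iso(row["order_date"])
print(rows)  # all order_date values now in ISO 8601 form
```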

Confidential

Data Engineer

Responsibilities:

  • Performed a key role in the Billing, Metering, Finance, and Service Order modules.
  • Profiled, extracted, and transformed data (ETL processes).
  • Loaded data into the CC&B system and generated reports (SAP BO, Cognos) for different teams.
  • Designed and implemented enhancements in the data architecture per the shared requirements.
  • Tested data by writing complex queries and documented status for different modules.
  • Designed functional queries for generating reports and automated the reporting process.
  • Worked in Waterfall and Agile methodologies and was an active member in scrum meetings.
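The actual reporting queries ran against the CC&B schema; as a sketch of their general shape, here is a hypothetical per-module aggregate query run against an in-memory SQLite database with made-up tables and values:

```python
# Illustrative only: table name, columns, and amounts are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE bill (bill_id INTEGER, module TEXT, amount REAL);
    INSERT INTO bill VALUES (1, 'Billing', 120.0),
                            (2, 'Metering', 80.0),
                            (3, 'Billing', 30.0);
""")
# Shape of a typical report query: total billed amount per module.
report = conn.execute(
    "SELECT module, SUM(amount) AS total "
    "FROM bill GROUP BY module ORDER BY module"
).fetchall()
print(report)  # -> [('Billing', 150.0), ('Metering', 80.0)]
```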
