ETL + BI Developer Resume Profile
AZ
Background Summary
- Extensive experience in building database and data warehouse structures and in creating, populating, and maintaining data marts, business intelligence, decision support systems, and corporate performance management solutions, including dashboards, scorecards, query/analysis reports, and KPIs, using MS SQL Server Analysis Services (SSAS), MS SQL Server Integration Services (SSIS), MS SQL Server Reporting Services (SSRS), IBM Cognos, Teradata, Informatica, DMExpress, Talend, cloud computing, and Big Data.
- Capable of handling responsibilities independently and as a team member. Built various types of reports such as parameterized, chart, graph, linked, drill-down, drill-through, and cascading reports. Well versed in data mappings, data import interfaces, KPI-based measures, data mart analytics for OLAP/OLTP, and data visualization tools such as Dundas and Tableau.
- Certified Oracle PL/SQL Developer with proficient skills in SQL, PL/SQL, SQL*Plus, T-SQL, and MySQL.
- Experienced with MS SQL Server Management Studio (SSMS); created SQL tables, views, constraints, functions, indexes, set operators, sequences, synonyms, joins, subqueries, and T-SQL stored procedures, triggers, user-defined functions (UDFs), and dynamic SQL (see the sketch after this summary).
- Proficient in developing complex Oracle PL/SQL stored procedures, collections, records, all types of triggers, functions, exceptions, ref cursors, and cursor variables.
- Experienced in extracting, transforming, and loading data using MS SQL Server Integration Services; experienced in defining DSVs with named queries, views, and named calculations; building, deploying, and processing cubes, measures, dimensions, attribute relationships, KPIs, mining structures, MDX scripts, MDX expressions, and MDX functions; and populating OLAP cubes from heterogeneous sources and destinations such as OLE DB, ADO.NET, ODBC, RDBMS, Excel, ASCII flat files, RDF, XML, and web services.
- Experienced with SQL Mail Agent, scheduling SSIS packages, data analysis, OLAP cube design, cube analysis, and enterprise-wide data warehousing applications and solutions.
- Proficient in performance tuning in MS SQL Server using SQL Server Profiler, Query Analyzer, Enterprise Manager, Index Tuning Wizard, Windows Performance Monitor, and database backup/recovery.
- Extensively used ETL tools such as Informatica 9.1/8.6/8.0 and MS SSIS, along with methodologies for data extraction, transformation, and loading, using transformations such as Source Qualifier, Aggregator, Lookup (Types I, II, III), Rank, Incremental, Joiner, Filter, Router, and Update Strategy.
- Proficient in data modeling; data mapping identification and documentation; data extraction and load processes from multiple data sources; data verification, analysis, and cleansing; transformation and integration; data import/export; and attributes and hierarchies.
- Proficient with a variety of design and development tools such as ERwin, Embarcadero ER/Studio, TOAD, and Oracle SQL Developer, as well as the MS Office suite (MS Excel, MS PowerPoint, MS Visio, MS Word) for requirement analysis and specifications.
- Extensive exposure to Ralph Kimball's data warehousing methodology and expertise in building data warehouses based on Star, Snowflake, Galaxy, and Extended Star schemas, data marts, and staging areas.
- Expertise in business intelligence solutions based on predictive analytics and data mining algorithms.
- Good knowledge of HIPAA regulations, EDI transaction codes such as Benefit Enrollment (834) and Health Care Claim (837), ICD-9/10, and HL7 standards, with an understanding of HMO/PPO insurance policies.
- Good knowledge of functional and geographic flowcharts, check sheets, histograms, run charts, control charts, cause-and-effect diagrams, interrelationship diagrams, Pareto charts, scatter diagrams, and affinity diagrams.
- Expertise in NoSQL databases using column-store, key-value-store, and document-store data models, including document store databases such as MongoDB and AWS DynamoDB; also experienced in the .NET Framework, MVC, MVVM, MVP, WCF, WPF, WWF, LINQ to SQL, LINQ to XML, LINQ to Objects, and Teradata BTEQ.
- Flexible, enthusiastic, and project-oriented team player with excellent written and verbal communication, documentation, and leadership skills, able to develop creative solutions for challenging client needs.
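A minimal T-SQL sketch of the stored procedure / dynamic SQL / TRY...CATCH pattern referenced above; the procedure, table, and column names are hypothetical and used only for illustration.

```sql
-- Illustrative only: dbo.Customer and its columns are hypothetical.
CREATE PROCEDURE dbo.usp_GetCustomersByState
    @StateCode  CHAR(2),
    @SortColumn SYSNAME = N'CustomerName'   -- validated before use in dynamic SQL
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- Whitelist the sort column so the dynamic SQL stays safe.
        IF @SortColumn NOT IN (N'CustomerName', N'CreatedDate')
            SET @SortColumn = N'CustomerName';

        DECLARE @sql NVARCHAR(MAX) = N'
            SELECT CustomerId, CustomerName, StateCode, CreatedDate
            FROM dbo.Customer
            WHERE StateCode = @StateCode
            ORDER BY ' + QUOTENAME(@SortColumn) + N';';

        EXEC sp_executesql @sql, N'@StateCode CHAR(2)', @StateCode = @StateCode;
    END TRY
    BEGIN CATCH
        -- Re-raise the original error to the caller.
        THROW;
    END CATCH
END;
```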
Tools and Technology:
| MS SSAS, MS SSRS 2012/2008 R2/2005, SharePoint 2010, PerformancePoint, MS Excel, Tableau 8.1, Dundas 5.0, TOAD |
| MS SSIS on SSMS 2012/2008 R2/2008/2005, Informatica PowerCenter 9.1/8.6/7.1, Talend, DMExpress |
| ERwin 9/7/4.1/4.0/3.5, MS Visio, ER/Studio, Primavera |
| MS SQL Server 2012/2008 R2/2008/2005, MS Access, Oracle 11g/10g/9i, MySQL, Teradata 14/13 |
| T-SQL, SQL*Plus, PL/SQL, BTEQ, HTML, XML, basic Unix shell |
| TOAD 8.5, SSMS, SQL*Plus, SQL Analyzer, SQL*Loader, Explain Plan, Query Analyzer |
| MS Project, MS Visio, MS Excel, MS Excel PowerPivot/Power View, MS Word, MS PowerPoint, VSS (Visual SourceSafe) |
| Windows 2007, 2003, XP, Vista, 7, 8, Red Hat Enterprise Linux/Unix |
| Ralph Kimball's data warehouse dimensional modeling, Star, Snowflake, Galaxy, and Extended Star schemas, OOAD, RUP |
| MongoDB, AWS DynamoDB, Hadoop HDFS, cloud computing |
| MS Office, MS Outlook |
| ASP.NET, VB.NET, C#.NET, WCF, WPF, WWF, LINQ to SQL, LINQ to XML, LINQ to Objects |
Professional Experience
Confidential
ETL/BI Developer, Confidential
CNTelligence is an online, multi-tenant relationship management solution giving camps and businesses a 360° view of the sales desk, campaign management, relationship management, brand management, reputation management, help desk, and reviews/feedback management. It helps businesses provide better customer service, cross-sell and up-sell more effectively, close deals, retain current customers, and better understand who the customer is.
Responsibilities:
- Identified project requirements through JAD sessions to build the various data marts and the business intelligence system, and maintained the procedure analysis and corporate budget analysis.
- Used ERwin to develop Conceptual Data Models (CDMs) and Logical Data Models (LDMs) for the enterprise, exploring the domain concepts and their relationships.
- Developed Physical Data Models (PDMs) to design the internal schema of the database, depicting the data tables, their columns, and the relationships between the tables.
- Created table- and column-level constraints, data types, expressions, conditions, and functions, and applied primary key/foreign key relationship references. Executed complex SQL queries, including DDL, DML, TCL, DQL, and DCL statements, all types of joins, subqueries (scalar, nested, and correlated), views, sequences, aliases, and synonyms.
- Implemented T-SQL stored procedures, user-defined functions (UDFs), and triggers to accomplish various operations and functionality, and utilized scalar/table variables, temp tables, selection and control structures, cursor types, TRY...CATCH exception handling, and dynamic SQL for various special-purpose behavior. Used COMMIT/ROLLBACK transaction control statements to complete transactions.
- Created clustered and non-clustered indexes to improve query performance on tables with very large numbers of records.
- Performed data partitioning for efficient management of database tables and indexes, and addressed performance issues such as stale statistics through UPDATE STATISTICS and auto-stats.
- Involved in designing, developing, and testing the automated process of dynamically selecting source-system feed files and extracting, transforming, and loading (ETL) the data into the database using SSIS packages.
- Executed data load reviews, analysis, and verification of the ETL logic design for the data warehouse and data marts in Star and/or Snowflake schema methodology with conformed dimensions and fact tables. Reviewed and verified that ETL packages were built as intended to refresh the DW.
- Designed and created ETL packages with different transformations for loading data from heterogeneous sources into targets, performing tasks and transformations such as Execute Package Task, Execute SQL Task, Derived Column, Fuzzy Lookup, Conditional Split, Lookup, Multicast, Merge, Aggregate, Pivot, and Sort.
- Used SQL Server Integration Services to load the newest data into the data warehouse with Slowly Changing Dimension transformations (Type 1, Type 2, and Type 3); a minimal T-SQL sketch of the Type 2 pattern appears after this list.
- Built and debugged packages to perform workflow functions such as FTP operations, executing SQL statements, and sending email messages.
- Created OLAP Cubes by using SQL Server Analysis Services.
- Extracted data from different cubes and generated ad-hoc and parameterized reports using KPIs and drill-through targets to other reports with SQL Server Reporting Services. Defined the project with the client, clarifying business objectives, identifying meaningful metrics, and establishing data requirements. Analyzed hourly snapshots collected over several months to build statistical models.
- Used SQL Server Reporting Services to generate multi-dimensional reports such as graphical, tabular, matrix, drill-down, sorting, ranking, slice, and free-form reports from multiple data sources such as relational databases, flat files, and XML sources.
- Used SQL Server Analysis Services to identify key measures and dimensions for the dimensional model, and assigned hierarchies and attribute relationships to the various dimensions and measures defined.
- Implemented SQL Server Profiler for fine-tuning various SQL statements, query optimization, and trace utilities.
- Deployed reports to SharePoint Server to create, publish, and share reports.
- Extracted data from SSAS and SSRS, and used PerformancePoint to create and design powerful dashboards, scorecards, and reports and publish them to a SharePoint server.
- Used PowerPivot and Power View in MS Excel to create, export, and print large, multi-table data sets from heterogeneous sources and to populate reports.
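A minimal T-SQL sketch of the Type 2 slowly-changing-dimension pattern referenced in the SSIS bullet above, expressed as a MERGE with an OUTPUT-driven insert; dbo.DimCustomer, stg.Customer, and their columns are hypothetical names used only for illustration.

```sql
-- Illustrative only: dbo.DimCustomer / stg.Customer and their columns are hypothetical.
-- The MERGE expires changed current rows and inserts brand-new customers;
-- the outer INSERT re-adds the new current version of each expired row.
INSERT INTO dbo.DimCustomer (CustomerBK, City, Segment, StartDate, EndDate, IsCurrent)
SELECT CustomerBK, City, Segment, GETDATE(), NULL, 1
FROM (
    MERGE dbo.DimCustomer AS tgt
    USING stg.Customer AS src
        ON tgt.CustomerBK = src.CustomerBK AND tgt.IsCurrent = 1
    WHEN NOT MATCHED BY TARGET THEN
        INSERT (CustomerBK, City, Segment, StartDate, EndDate, IsCurrent)
        VALUES (src.CustomerBK, src.City, src.Segment, GETDATE(), NULL, 1)
    WHEN MATCHED AND (tgt.City <> src.City OR tgt.Segment <> src.Segment) THEN
        UPDATE SET tgt.IsCurrent = 0, tgt.EndDate = GETDATE()
    OUTPUT $action AS MergeAction, src.CustomerBK, src.City, src.Segment
) AS changes
WHERE MergeAction = 'UPDATE';
```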
Environment: MS SQL Server 2012, SSMS, MS SQL, T-SQL, MS SQL Server Integration Services, MS SQL Server Analysis Services, MS SQL Server Reporting Services, ERwin 9.1, XML, PerformancePoint, SharePoint 2010, MS Excel, MS Excel PowerPivot, Power View, Windows.
Confidential
BI Developer,
Confidential is one of the best ways for patients to make appointments online, and it benefits doctors and dentists by helping them see new patients and build their practice at minimal expense. The website is a leading online resource for comprehensive information about physicians, dentists, clinics, hospitals, and insurance, enabling users to find consultants with a list of available appointments and to choose providers that accept their insurance, avoiding service delays. Customers and patients are empowered through the portal to get objective information about clinical outcomes, satisfaction, safety, and health conditions so they can make more informed healthcare decisions.
Responsibilities:
- Reverse-engineered the components of the maintenance management system to implement physical and logical data models of the system using ERwin Data Modeler.
- Created simple and complex tables, views, joins, subqueries, synonyms, and sequences to access and modify tables in the database.
- Implemented stored procedures, triggers, user-defined functions, and dynamic SQL to execute the business rules.
- Established table- and row-level triggers to monitor, prevent, and record object changes in the database (see the audit-trigger sketch after this list).
- Used SQL Server Profiler to optimize and tune various SQL queries and stored procedures to support Big Data workloads.
- Used SSAS to organize data warehouse data into OLAP cubes and created various data marts by configuring the data source and data source view in the SSAS project wizard. Arranged data into measures and attributes in the OLAP cubes.
- Deployed the SSAS OLAP cube and enhanced it by editing dimensions to add attributes and by constructing calculations, KPIs, and hierarchies. Executed Multi-Dimensional Expressions (MDX) to query the OLAP cube and produce results.
- Used SSRS to develop and deploy various types of reports, such as matrix, drill-down, ad-hoc, cross-tab, cascading, and parameterized reports.
- Developed and maintained cubes per the technical specifications using SSAS. Applied MDX queries to create custom pivot reports and calculated members in the cubes.
- Built powerful dashboards in Tableau Desktop to visualize and analyze the data, and shared the results with other users through Tableau Server.
- Maintained cross-functional information integrity by designing and developing Tableau BI dashboards, scorecards, charts/graphs, maps, spotlights, and dynamic reports that meet business needs and present visualized data in a quicker and easier way.
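A minimal T-SQL sketch of the row-level audit trigger pattern mentioned above; dbo.Appointment and dbo.Appointment_Audit are hypothetical tables used only for illustration.

```sql
-- Illustrative only: dbo.Appointment and dbo.Appointment_Audit are hypothetical.
CREATE TRIGGER dbo.trg_Appointment_Audit
ON dbo.Appointment
AFTER UPDATE, DELETE
AS
BEGIN
    SET NOCOUNT ON;
    -- Record the prior version of every changed or deleted row.
    INSERT INTO dbo.Appointment_Audit
        (AppointmentId, ProviderId, ApptDate, ChangedBy, ChangedAt, ChangeType)
    SELECT d.AppointmentId, d.ProviderId, d.ApptDate, SUSER_SNAME(), GETDATE(),
           CASE WHEN EXISTS (SELECT 1 FROM inserted) THEN 'UPDATE' ELSE 'DELETE' END
    FROM deleted AS d;
END;
```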
Environment: MS SQL Server 2012, SSMS, MS SQL, T-SQL, MS SQL Server Analysis Services, MS SQL Server Reporting Services, MDX, Tableau 8.1, ERwin 9.1, MS Office, MS Excel, Windows.
Confidential
ETL Developer,
MMIS is an advanced software technology application module that generates the required materials bill of quantities from project drawings and complex layouts, and bridges the gap between design, procurement, accounting, and materials control to make data management effective and efficient. The application provides unique codes for all types of materials and equipment, and consolidates requirements at the project level to optimize purchasing performance and inventory.
Responsibilities:
- Participated in multiple JAD sessions to collect business requirements from various stakeholders, business analysts, and technical staff in order to develop entity-relationship/data models, the requirements document, and the ETL specifications. Prepared and analyzed the AS-IS and TO-BE workflow scenarios.
- Converted the Business Requirement Document (BRD) into high-level and low-level designs and created various PowerPoint files for presentations.
- Performed installations and configurations of Oracle databases.
- Participated in ERwin design and customized various data models, such as LDMs and PDMs, for a data warehouse supporting data from multiple sources.
- Identified the required dimension and fact columns from the OLTP system.
- Used MS Excel sheets for collecting and linking/mapping metadata from heterogeneous sources and targets, including relational databases and flat files.
- Created sequential batches and concurrent batches for sessions.
- Developed PL/SQL procedures and packages to migrate the data into the Oracle database (a minimal sketch follows this list).
- Supported daily loads and worked with business users to handle rejected data.
- Used the SQL*Loader tool to load bulk data into the Oracle database.
- Prepared test scenarios and test cases, and was involved in unit testing, integration testing, system testing, and user acceptance testing (UAT).
- Developed interfaces using UNIX shell scripts to automate the bulk load and update processes.
- Migrated mappings from Development to Testing and from Testing to Production.
- Performed query optimization and tuned ETL/SQL queries for better performance.
- Used MS Excel and MS Word for tracking defect reports.
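A minimal PL/SQL sketch of a procedure that migrates staged data into an Oracle target table, of the kind referenced above; stg_material, material, and their columns are hypothetical names used only for illustration.

```sql
-- Illustrative only: stg_material and material are hypothetical tables.
CREATE OR REPLACE PROCEDURE load_material AS
BEGIN
    -- Move validated staging rows into the target table.
    INSERT INTO material (material_code, description, unit_of_measure, load_date)
    SELECT s.material_code, s.description, s.unit_of_measure, SYSDATE
    FROM   stg_material s
    WHERE  s.material_code IS NOT NULL;

    COMMIT;
EXCEPTION
    WHEN OTHERS THEN
        ROLLBACK;
        -- Log and re-raise so the scheduler marks the load as failed.
        DBMS_OUTPUT.PUT_LINE('load_material failed: ' || SQLERRM);
        RAISE;
END load_material;
/
```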
Environment: Oracle 11g, SQL, PL/SQL, SQL*Loader, TOAD, Oracle SQL Developer, Teradata, ERwin 4.1, MS Word, MS Excel, MS PowerPoint, Unix/Linux.
Confidential
PL/SQL Developer,
HRMPS is a customizable technical solution product module specializing in organizational human resources goals and policies. The application enables a better understanding of employees, smarter hiring decisions, and improved efficiency. The wage payments module helps reduce operating expenses and offers an option for automated paperless payroll processing and electronic payment distribution.
Responsibilities:
- Established ER data models, such as conceptual and logical data models, using the ERwin tool.
- Created various SQL objects such as tables, views, constraints, operators, indexes, sequences, synonyms, and materialized views; joins such as left, right, full, and self joins; and subqueries such as correlated, scalar, and nested subqueries.
- Constructed complex SQL queries with subqueries and inline views per the functional needs in the business requirements document.
- Extensively used SQL GROUP BY, HAVING, ORDER BY, and TOP clauses, created PL/SQL programs, and used Oracle partitioning to extract data from large databases.
- Enforced database integrity using primary and foreign keys, and imported data from flat files into the Oracle database through staging tables using SQL*Loader.
- Performed historical data fixes for data validation and created DDL scripts for implementing data modeling changes.
- Implemented dynamic SQL, PL/SQL collections, records, and exception handling.
- Developed functions and procedures to manipulate data and load data into the tables, and packages for back-end processing of the proposed database design.
- Created various PL/SQL code, such as cursors, dynamic SQL, collections, records, stored procedures, stored functions, all types of triggers, and packages, to retrieve detailed information.
- Implemented exception handling using system-defined and user-defined exceptions.
- Used MS Visual SourceSafe for version control and file sharing. Involved in unit testing and system integration testing for PL/SQL stored procedures and functions. Analyzed queries using the SQL Trace facility and the EXPLAIN PLAN utility to understand the execution process.
- Performed performance tuning of complex SQL queries by eliminating redundant joins, creating function-based indexes, and removing redundant code (see the sketch after this list).
- Used TOAD for Oracle to modify existing procedures, functions, and packages per requirements.
- Optimized queries by modifying data access methods, index strategies, join types and operations, and by providing hints.
- Responsible for data load reviews, analysis, and verification of ETL logic design for the data warehouse and data marts in Star schema methodology.
- Worked in a Linux environment, using commands such as tar, grep, ssh, find, and sed.
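A minimal sketch of the function-based index and EXPLAIN PLAN tuning workflow mentioned above; the employee table and its columns are hypothetical and used only for illustration.

```sql
-- Illustrative only: employee is a hypothetical table.
-- A function-based index lets a case-insensitive predicate use an index
-- instead of forcing a full table scan.
CREATE INDEX idx_employee_upper_name ON employee (UPPER(last_name));

EXPLAIN PLAN FOR
SELECT employee_id, last_name, hire_date
FROM   employee
WHERE  UPPER(last_name) = 'SMITH';

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);   -- verify that the index range scan is used
```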
Environment: Oracle 10g, 9i, SQL, PL/SQL, SQL Loader, TOAD, ERwin 4.1, TERADATA BTEQ, Unix/Linux, Windows, MS Office, MS Excel, MS PowerPoint, MS Visual Source Safe.
Confidential
ETL Developer,
The database was established to analyze performance statistics of manufactured products and to recommend improvements for rejected products and for the time elapsed between machine processes. The project prototyped solutions for market analysis to find potential clients in the electronic manufacturing industry, and set up prototypes to enable decision makers to monitor a number of key aspects of the organization's manufacturing operations.
Responsibilities:
- Involved in the Software Development Life Cycle (SDLC), including analysis, design, ETL strategy, reporting, and identifying facts and dimensions.
- Prepared the required application design documents based on the functionality required.
- Designed the ETL processes using Informatica to load data from heterogeneous sources such as SQL Server, Oracle 9i, flat files, and Excel files into a staging database, and from staging into the target SQL Server 2008 data warehouse database.
- Implemented best practices for the creation of mappings, sessions, and workflows and for performance optimization.
- Created ETL mappings using transformations such as Source Qualifier, Joiner, Router, Aggregator, Expression, Filter, Connected/Unconnected Lookup, Update Strategy, Stored Procedure, and Sequence Generator.
- Designed and developed the logic for handling Slowly Changing Dimension (SCD) tables, flagging records through the Update Strategy transformation to populate the desired targets.
- Involved in data cleansing/scrubbing and extraction, defined the quality process for the data warehouse, and performed tuning and optimization of Informatica mappings and sessions using features such as partitions and data/index caches to manage very large data volumes (a de-duplication sketch follows this list).
- Documented ETL test plans, test cases, test scripts, test procedures, assumptions, and validations based on design specifications for expected results; prepared and loaded test data for testing, error handling, and analysis. Involved in the migration of mappings and sessions from the development repository to the production repository.
- Created complex T-SQL stored procedures to transform the data for the various transformation needs while loading.
- Involved in unit testing, integration testing, and user acceptance testing to verify that the data extracted from the different source systems loaded into the target according to user requirements, and provided production support, handling various mitigation tickets created while users retrieved data from the database.
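A minimal T-SQL sketch of a staging-table de-duplication step of the kind used during data cleansing before a warehouse load; stg.ProductFeed and its columns are hypothetical names used only for illustration.

```sql
-- Illustrative only: stg.ProductFeed and its columns are hypothetical.
-- Keep only the most recent staging row per natural key before loading the warehouse.
;WITH ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (PARTITION BY ProductCode
                              ORDER BY ExtractDate DESC) AS rn
    FROM stg.ProductFeed
)
DELETE FROM ranked
WHERE rn > 1;
```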
Environment: MS SQL Server 2008 R2/2008, SSMS, MS SQL, T-SQL, ERwin 4.0, Informatica Power Center 8.6, Oracle 9i, PL/SQL, MS Excel, Windows.
Confidential
Business Intelligence Analyst,
The purpose of the ISS project was to upgrade services, perform marketing analysis to find potential customers, and locate inventory using LIFO and FIFO methods designed to lower costs. The database was established from various data marts to organize inventory data, including information from various retail stores. It reported to business owners the monetary value of the inventory, the physical assets, and various item counts in each store. ISS offers a structured approach covering the conversion of custom code and data migration.
Responsibilities:
- Redesigned the existing logical and dimensional data models to accommodate the discovery analysis from the business process review, requirements gathering, gaps, and areas of improvement.
- Implemented standard program upgrades and object clean-up, removing unwanted fields from objects in the database.
- Incorporated additional facts and dimensions using ERwin 4.1 and coordinated closely with the business users and the IT team while reengineering the data model.
- Designed functional upgrades, custom solution upgrades, and manual migration of modified objects.
- Performed first- and second-round data migrations and tested data integrity.
- Designed and implemented best-approach BI solutions, working with other BI developers to ensure high BI availability and rapid response times.
- Used various MS SQL queries with joins (inner/equi, self, left/outer, cross), subqueries (nested, correlated), tables, and views, and used MS T-SQL code such as triggers, stored procedures, and user-defined functions (UDFs) to create ad-hoc reports based on business requirements (see the report-query sketch after this list).
- Collected user feedback on all functionality and set up the live environment with migrated programs and data to start live transactions.
- Used SSRS to create templates for standard reports, extensively used various formatting features, and provided support and/or training to end users.
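A minimal T-SQL sketch of the kind of ad-hoc inventory report query described above, combining joins with a correlated subquery; dbo.Store, dbo.InventoryItem, and dbo.Sale are hypothetical tables used only for illustration.

```sql
-- Illustrative only: dbo.Store, dbo.InventoryItem, and dbo.Sale are hypothetical.
-- Ad-hoc report: on-hand quantity and last sale date per store and item.
SELECT st.StoreName,
       it.ItemName,
       it.QuantityOnHand,
       (SELECT MAX(s.SaleDate)              -- correlated subquery
        FROM dbo.Sale AS s
        WHERE s.ItemId = it.ItemId
          AND s.StoreId = st.StoreId)       AS LastSaleDate
FROM dbo.Store AS st
INNER JOIN dbo.InventoryItem AS it
        ON it.StoreId = st.StoreId
WHERE it.QuantityOnHand > 0
ORDER BY st.StoreName, it.ItemName;
```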
Environment: MS SQL Server 2005, MS SQL Server Management Studio, MS SQL, T-SQL, SQL Server Reporting Services, TERADATA BTEQ, Windows
Confidential
Business Analyst,
The CMS Service System project involves developing data marts from existing Excel files, flat files, and databases such as SQL Server and Oracle. The sales data mart is one of the major database applications, extracting data from various operational systems and loading it into data marts. This data mart is used by the reporting applications to generate operational, transactional, and business reports that support the best customer service and provide performance statistics for business improvement.
Responsibilities:
- Determined operational objectives by studying business functions, gathering information, and evaluating output requirements and formats.
- Defined project requirements by identifying project milestones, phases, and elements; formed the project team; and established the project budget.
- Executed As-Is and To-Be analysis for the current-state and future-state process flows.
- Performed analysis of the Business Requirement Document (BRD) and the Functional Specification Document (FSD). Developed the test strategy, test plan, and test cases, and documented defects for timely rectification. Developed policies, procedures, process flows, and business, functional, non-functional, and detailed test cases to identify risks to operations.
- Enhanced performance measures to monitor and report the quality and timeliness of process-related information by providing requirements, use cases, and test cases. Formalized business requirements into system requirements for implementation by development groups.
- Created use cases for the requirements and coordinated with the development team to understand the requirements; developed use cases, functional requirements, and business flow diagrams using MS Visio.
- Implemented business optimization projects, tracked the business Requirements Traceability Matrix (RTM), maintained requirement changes in the RTM, and assisted the QA team in performing tests.
- Assisted the project team in developing the Request for Proposal (RFP) and the Change Management Request (CMR) to choose a plan that meets the business needs.
- Maintained user confidence and protected operations by keeping information confidential, and prepared technical reports by collecting, analyzing, and summarizing information and trends.
Environment: MS Office 2007, MS Word, MS Excel, MS Power Point, MS Visio, UML, Visual Source Safe, Windows, ASP.Net Framework
Confidential
PMIS/SQL Developer,
Shell Petrochemical is one of the global business groups in a diversified business, and the project involved the construction of the world's largest petrochemical facility in Jubail Industrial City. We maintained a data warehouse with OLTP and OLAP databases for the project's reporting needs and developed software applications for materials assessment from project drawings and for producing bills of quantities for the petrochemical projects.
Responsibilities:
- Designed and scheduled functional requirements and queries for reports to enable contractors to verify project drawings against material take-off sheets.
- Generated reports using global variables, report expressions, and functions, and designed various report types, such as dynamically driven parameterized reports, reports with sub-reports, and drill-through reports, using PMIS reporting services for the chemical construction projects.
- Performed various Database Tasks such as browsing database objects, creating new objects, viewing dependencies between various objects, Security Management, monitoring Sessions, and viewing/modifying Database Parameters.
- Coordinated with KBR Alhambra, CA and KBR Houston, TX offices for Project Engineering, Procurement and Logistics data updates.
- Used Primavera software to schedule the jobs and coordinated with the PMIS for logistics input.
- Generated reports based on project drawings and bills of quantities by executing SQL queries (an illustrative sketch follows).
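A minimal SQL sketch of a bill-of-quantities consolidation query of the kind described above; the drawing_material table, its columns, and the :project_id bind variable are hypothetical and used only for illustration.

```sql
-- Illustrative only: drawing_material is a hypothetical take-off table.
-- Consolidate quantities by material code across all drawings for one project.
SELECT material_code,
       description,
       SUM(quantity) AS total_quantity,
       unit_of_measure
FROM   drawing_material
WHERE  project_id = :project_id
GROUP BY material_code, description, unit_of_measure
ORDER BY material_code;
```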
Environment: Unisys PMIS SQL, MS Office, MS Word, MS Excel, MS Powerpoint, MS Visio, MS Access, Primavera
Confidential
Business/Finance Analyst,
Trident Steels Finance Group was engaged primarily in steel manufacturing, leasing, and financing, and also acted as a steel converting agent for the Steel Authority of India. Our project was responsible for the public issue, finance, procurement, and operations, and supported customers by supplying information solutions and data services.
Responsibilities:
- Used TMS (Trident Management System), Tally, and the EX package for project management, operations, finance, accounts, procurement, and engineering to maximize material availability and minimize surplus through the required aspects of planning and scheduling.
- Monitored daily status of job progress schedules, finance status, and account reconciliation; coordinated with various banks on bills of exchange; and participated in expenditure forecasts.
- Prepared, accumulated, and maintained materials control files, such as requisitions and purchase orders, using TMS. Coordinated with the Mumbai Stock Exchange for the public issue of Trident shares and debentures. Coordinated with various banks to open letters of credit to finance procurement.
- Developed the custom software package TMS for the above tasks using FoxPro and Clipper.
- Used Lotus 1-2-3 and WordStar for all business correspondence and spreadsheet requirements.
Environment: FoxPro, Clipper, Tally Package, EX Package, Lotus 1-2-3, WordStar
Professional Certifications and Training:
- Oracle Certification - OCA (SQL, PL/SQL)
- IIBA Business Agile System Analyst
- Dimensional modeling (Star and Snowflake) database design and QA/ETL testing by VENSOFT Inc.
- Implementing ETL Data Warehouse and Essentials of Data Warehouse Design by NETG
- Predictive Analytics by Quanta Intelligence
- ETL with Talend Open Studio; Teradata RDBMS - data integration and Big Data