
Tableau/BI Developer Resume


CA

SUMMARY:

  • ETL Talend Developer with 7 years of experience as a technical analyst, developer, and administrator.
  • Extensive experience in IBM InfoSphere DataStage ETL integration with the SAP Bank Analyzer functional module, performing data extraction, loading, and transformation for financial general ledger account data.
  • Experience in designing and implementing Data Warehouse applications mainly using ETL tool Talend Data Fabric for Big data integration and data ingestion.
  • Well versed in XML technology and the DOM and SAX parsers.
  • Experience in UNIX file/dataset management to keep load-ready data available for all financial transactional data.
  • Experience with Waterfall and Agile methodology project implementations.
  • Involved in the complete Software Development Life Cycle (SDLC) in a large data warehouse environment for a financial data system.
  • Thorough knowledge of data warehousing, dimensional modeling, data integration, data virtualization, data synchronization, star and snowflake schemas, ETL development and performance tuning, BI data analysis, SAP integration, and DFS/HDFS cluster segregation.
  • Worked with senior biostatisticians and clinical data managers to provide ETL programs for analyzing data and generating safety and efficacy datasets and summary tables.
  • Excellent oral and written communication skills. A quick learner with an eye for detail and excellent analytical and quantitative skills. Ability to work independently and adept at managing multiple competing tasks.

PROFESSIONAL EXPERIENCE:

Confidential, CA

Tableau/BI Developer

Responsibilities:

  • Extensively worked on T-SQL for creating stored procedures, indexes and functions
  • Implemented custom SSRS Reports to Internal operation team by reading data from the QNXT Database.
  • Added new users and applied roles in the QNXT UI in the Development and UAT Environments.
  • Created Reporting interface for Customer service team by reading the data from QNXT Claim, Claim Detail tables.
  • Implemented ETL Application to load Claim Payment information to QNXT Financial tables and provided access to financial team by creating custom roles in the security management.
  • Used Microsoft Power BI Power Query to extract data from external sources and transform it into the required format in Excel, and created SSIS packages to load the Excel sheets into the database.
  • Responsible for all Tableau and Power BI changes.
  • Ad Hoc BI Reporting using Tableau and data visualization in Tableau.
  • Created Dashboards using Power BI and linked them to SSRS Reports.
  • Tableau data visualization and reporting for asset location, cost, and performance.
  • Designed enterprise reports using SQL Server Reporting Services (SSRS 2008) that use multiple-value parameter pick lists, cascading prompts, drill-through reports, drill-down reports, matrix reports, and other Reporting Services features.
  • Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
  • Enabled speedy reviews and first mover advantages by using Oozie to automate data loading into the Hadoop Distributed File System and PIG to pre-process the data.
  • Worked in SSRS 2008 to generate 30+ reports for various user and developer groups.
  • Managed and reviewed Hadoop log files.
  • Tested raw data and executed performance scripts.
  • Shared responsibility for administration of Hadoop, Hive and Pig.
  • Deployed all reports to the reporting server and exported them to Excel and PDF formats.
  • Developed charts, pie charts, bar graphs, and series in SSRS.
  • Created ETL packages with different data sources (SQL Server, Flat Files, Excel source files, XML file) and then loaded the data into destination tables by performing different kinds of transformations using SQL Server Integration Services (SSIS).
  • Responsible for designing the SSIS packages to export data of flat file source to SQL Server Database
  • Involved in enhancement of existing SSIS packages.
  • Good knowledge in deploying SSIS packages to various environments.
  • Performed unit testing and created the unit test document.
  • Supported system testing and helped the testing team in execution of SSIS Jobs.
  • Used various control flow tasks like FTP, Bulk insert task, Conditional Split, Multicast, Merge task and different data flow tasks.
  • Provided supporting documentation for the project.

Environment: MS SQL Server 2008/R2 Management Studio, T-SQL, MS Excel, Windows 7, SSIS, SSRS, SSAS, Tableau, Power BI.
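The flat-file-to-table loads with conditional splits described above can be sketched in Python. This is an illustrative pattern only; the table and column names are hypothetical, not the actual QNXT schema, and SQLite stands in for SQL Server.

```python
import csv
import io
import sqlite3

# Hypothetical claims feed; column names are illustrative, not the QNXT schema.
raw = io.StringIO(
    "claim_id,amount,status\n"
    "C001,120.50,PAID\n"
    "C002,-15.00,PAID\n"
    "C003,300.00,DENIED\n"
)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE claims (claim_id TEXT PRIMARY KEY, amount REAL, status TEXT)")
conn.execute("CREATE TABLE claims_rejects (claim_id TEXT, amount REAL, status TEXT)")

# Conditional split: negative amounts go to a reject table,
# analogous to an SSIS Conditional Split transformation.
for row in csv.DictReader(raw):
    target = "claims" if float(row["amount"]) >= 0 else "claims_rejects"
    conn.execute(f"INSERT INTO {target} VALUES (?, ?, ?)",
                 (row["claim_id"], float(row["amount"]), row["status"]))

loaded = conn.execute("SELECT COUNT(*) FROM claims").fetchone()[0]
rejected = conn.execute("SELECT COUNT(*) FROM claims_rejects").fetchone()[0]
print(loaded, rejected)  # 2 1
```

In a real SSIS package the split expression, error outputs, and destination mappings are configured in the data flow designer rather than in code.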

Confidential, Brentwood, TN

ETL Talend Developer

Responsibilities:

  • Developed complex ETL jobs from various sources such as SQL Server, PostgreSQL, and other files, and loaded the data into target databases using the Talend ODS ETL tool.
  • Created Talend jobs using the dynamic schema feature.
  • Created the Talend jobs for Store Support Center, Coupons jobs and Promotions etc.
  • Interact with business community and gathered requirements based on changing needs. Incorporated identified factors into Talend jobs to build the Data Mart.
  • Performance tuning: used tMap cache properties, multi-threading, and parallelization components for better performance with large source data volumes; tuned SQL source queries to restrict unwanted data in the ETL process.
  • Involved in Preparing Detailed design and technical documents from the functional specifications.
  • Prepared low level design documentation for implementing new data elements to EDW.
  • Provided Knowledge Transfer to the end users and created extensive documentation on the design, development, implementation, daily loads and process flow of the mappings.
  • Used AWS (Amazon Web Services) components to download and upload data files (with ETL) to S3 using Talend S3 components.
  • Used many Talend components in my job designs, including tMap, tFilterRow, tJava, tOracle, tXMLMap, tFileInputDelimited, tLogRow, and tLogCatcher.
  • Worked on Joblets (reusable code) & Java routines in Talend.
  • Design, Develop and Test ETL processes in order to meet project requirements
  • Created projects in TAC, assigned appropriate roles to developers, and integrated SVN (Subversion).
  • Provided on-call support when the project was deployed to later phases.
  • Used the Talend Admin Console Job Conductor to schedule ETL jobs on a daily, weekly, monthly, and yearly basis (Cron trigger).

Environment: Talend Open Studio v5.6, UNIX, AWS S3, Microsoft SQL Server Management Studio, PostgreSQL, Netezza, Oracle, XML processing.
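The tMap lookup caching mentioned under performance tuning can be sketched as follows: the lookup table is loaded into memory once, and the main flow streams against it. All table and field names here are invented for illustration; this is not the actual job logic.

```python
# Lookup rows loaded once into memory, mirroring tMap's load-once cache model.
promotions = {"P1": 0.10, "P2": 0.25}  # promo_id -> discount (illustrative values)

orders = [
    {"order_id": 1, "promo_id": "P1", "amount": 100.0},
    {"order_id": 2, "promo_id": "P9", "amount": 50.0},  # no matching promo
]

def join_with_lookup(rows, lookup):
    """Inner-join the main flow against the cached lookup, like a tMap inner join."""
    out = []
    for r in rows:
        discount = lookup.get(r["promo_id"])
        if discount is None:
            continue  # reject flow: unmatched rows are dropped in inner-join mode
        out.append({**r, "net": r["amount"] * (1 - discount)})
    return out

result = join_with_lookup(orders, promotions)
print(result)  # one matched row with net 90.0
```

In Talend the same trade-off appears as the tMap lookup "load once" versus "reload at each row" setting; caching the lookup avoids a database round trip per main-flow row.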

Confidential, CA

ETL Talend Developer/Admin

Responsibilities:

  • Participated in all phases of development life-cycle with extensive involvement in the definition and design meetings, functional and technical walkthroughs.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Created and managed Source to Target mapping documents for all Facts and Dimension tables.
  • Designed, developed, validated, and deployed Talend ETL processes for the DWH team using Hadoop (Pig, Hive).
  • Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Extensively used the tMap component for lookup and join functions, along with tJava, tOracle, tXMLMap, tFileInputDelimited, tLogRow, and tLogCatcher; created and worked with over 100 components across my jobs.
  • Developed multiple tracks (modules) for data migration between different systems, such as MRA and TIPP to NRSC, LIM to JAS, EPM to MR, and ONW to TIPP.
  • Used ETL methodologies and best practices to create Talend ETL jobs. Followed and enhanced programming and naming standards.
  • Involved in Data Extraction from Flat files and XML files using Talend by using Java as Backend Language.
  • Created and deployed physical objects including custom tables, custom views, stored procedures, and Indexes to SQL Server for Staging and Data-Mart environment.
  • Developed ETL mappings for various sources (.TXT, .CSV, XML) and loaded the data from these sources into relational tables with Talend Enterprise Edition.
  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, including Fact tables and Slowly Changing Dimensions (SCD Type 1 and Type 2).
  • Handled importing of data from various data sources, performed transformations using Hive, Map Reduce, Spark and loaded data into HDFS.
  • Experience using DOM4J and XMLBeans to process, validate, parse, and extract data from XML files.
  • Implemented installation and configuration of a multi-node cluster in the cloud using AWS EC2.
  • Used Talend most used components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more).
  • Copy data to AWS S3 for storage and use COPY command to transfer data to Redshift. Used Talend connectors integrated to Redshift.
  • Developed jobs to expose HDFS files as Hive tables and views, depending on the schema versions.
  • Imported data from RDBMS (MySQL, Oracle) to HDFS and vice versa using Sqoop (a Big Data ETL tool) for business intelligence, visualization, and report generation.
  • Worked on Talend Administration Console (TAC) for scheduling jobs and adding users.
  • Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
  • Automated SFTP process by exchanging SSH keys between UNIX servers. Worked Extensively on Talend Admin Console and Schedule Jobs in Job Conductor.

Environment: Talend Data Integration 6.4.0, Talend Enterprise Big Data Edition, Talend Administrator Console, XML, Oracle 11g, Hive, HDFS, Sqoop, SQL Navigator, TOAD, Control-M, PuTTY, WinSCP.
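The SCD Type 2 handling mentioned above (expire the current dimension row, insert a new version) can be illustrated with a minimal sketch. The customer attributes and dates are made up; a real implementation would also handle brand-new keys and surrogate key generation.

```python
from datetime import date

# Existing dimension rows: key, tracked attribute, effective dates, current flag.
dim = [
    {"cust_id": "A1", "city": "Fresno", "from": date(2015, 1, 1),
     "to": None, "current": True},
]

def scd_type2_apply(dim, cust_id, new_city, as_of):
    """Close the current row and insert a new version when the attribute changes."""
    for row in dim:
        if row["cust_id"] == cust_id and row["current"] and row["city"] != new_city:
            row["to"], row["current"] = as_of, False  # expire the old version
            dim.append({"cust_id": cust_id, "city": new_city,
                        "from": as_of, "to": None, "current": True})
            return
    # unchanged rows and new keys are omitted here for brevity

scd_type2_apply(dim, "A1", "Sacramento", date(2017, 6, 1))
print(len(dim), dim[-1]["city"])  # 2 Sacramento
```

In Talend this logic is typically expressed with a tMap lookup against the existing dimension plus separate insert and update outputs, rather than hand-written code.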

Confidential, AR

ETL Talend Developer

Responsibilities:

  • Interacted with business team to understand business needs and to gather requirements.
  • Designed target tables per the requirements from the reporting team, and designed Extraction, Transformation, and Loading (ETL) processes using Talend. Worked on data integration from different source systems.
  • Created Technical Design Document from Source to stage and Stage to target mapping. Worked with Talend Studio (Development area) & Admin Console (Admin area).
  • Created Java Routines, Reusable transformations, Joblets using Talend as an ETL Tool.
  • Created complex jobs using components such as tMap, tOracle, tLogCatcher, tStatCatcher, tFlowMeterCatcher, File Delimited components, and error-handling components (tWarn, tDie).
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Identified performance issues in existing sources, targets and Jobs by analyzing the data flow, evaluating transformations and tuned accordingly for better performance.
  • Manage all technical aspects of the ETL Jobs process with other team members.
  • Developed mappings to load Fact and Dimension tables, SCD Type 1 and SCD Type 2 dimensions and Incremental loading.
  • Created contexts to pass values throughout the process from parent jobs to child jobs and from child jobs to parent jobs. Worked on Joblets (reusable code) and Java routines in Talend.
  • Expertise in Service-Oriented Architecture (SOA); involved in publishing Web Services with components such as WSDL, SOAP, and UDDI.
  • Performed unit testing, created UNIX shell scripts, and provided on-call support. Scheduled Talend jobs using Job Conductor (the scheduling tool available in TAC).
  • Used XML for ORM mapping relations with the java classes and the database.
  • Retrieved data from Oracle and loaded into SQL Server data Warehouse.
  • Monitored data quality and generated weekly/monthly/yearly statistics reports on production processes (success/failure rates) for causal analysis as part of maintenance, and enhanced the existing production ETL process.

Environment: Talend Platform 6.2, Big Data, UNIX, Oracle, XML, TAC.
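The parent-to-child context passing described above can be sketched as a parent orchestrator handing a shared context to its child jobs. The job names and context keys (source_db, target_db, run_date) are illustrative, not taken from the actual project.

```python
# Context variables passed from a parent job to child jobs, similar in spirit
# to Talend context parameters handed down via tRunJob.

def child_extract(ctx):
    # child job reads the shared context to know what to extract
    return f"extract from {ctx['source_db']} for {ctx['run_date']}"

def child_load(ctx, payload):
    # a second child job uses the same context to know where to load
    return f"load into {ctx['target_db']}: {payload}"

def parent_job(ctx):
    # the parent orchestrates the children, handing each the shared context
    payload = child_extract(ctx)
    return child_load(ctx, payload)

ctx = {"source_db": "oracle_src", "target_db": "sqlserver_dwh",
       "run_date": "2017-01-15"}
print(parent_job(ctx))
```

Returning values from child to parent (the reverse direction mentioned above) works the same way here via return values; in Talend it is commonly done with buffer components or global variables.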

Confidential

ETL Talend Developer

Responsibilities:

  • Worked closely with Business Analysts to review the business specifications of the project and also to gather the ETL requirements.
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components.
  • Created and managed Source to Target mapping documents for all Facts and Dimension tables
  • Analyzed source data to assess data quality using Talend Data Quality.
  • Involved in writing SQL Queries and used Joins to access data from Oracle, and MySQL.
  • Prepared ETL mapping Documents for every mapping and Data Migration document for smooth transfer of project from development to testing environment and then to production environment.
  • Designed and implemented ETL for data loads from heterogeneous sources to SQL Server and Oracle target databases, including Fact tables and Slowly Changing Dimensions (SCD Type 1 and Type 2).
  • Utilized Big Data components like tHDFSInput, tHDFSOutput, tPigLoad, tPigFilterRow, tPigFilterColumn, tPigStoreResult, tHiveLoad, tHiveInput, tHbaseInput, tHbaseOutput, tSqoopImport and tSqoopExport.
  • Used Talend most used components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput & tHashOutput and many more)
  • Created many complex ETL jobs for data exchange from and to Database Server and various other systems including RDBMS, XML, CSV, and Flat file structures.
  • Experienced in using Talend's debug mode to debug jobs and fix errors.
  • Responsible for developing, support and maintenance for the ETL (Extract, Transform and Load) processes using Talend Integration Suite.
  • Conducted JAD sessions with business users and SMEs for a better understanding of the reporting requirements.
  • Developed Talend jobs to populate the claims data to data warehouse - star schema.
  • Used Talend Admin Console Job conductor to schedule ETL Jobs on daily, weekly, monthly and yearly basis.
  • Worked on various Talend components such as tMap, tFilterRow, tAggregateRow, tFileExist, tFileCopy, tFileList, tDie etc.
  • Worked Extensively on Talend Admin Console and Schedule Jobs in Job Conductor.

Environment: Talend Data Integration 5.5.1, Talend Enterprise Big Data Edition 5.1, XML, Talend Administrator Console, MS SQL Server 2012/2008, Oracle 11g, Hive, HDFS, Sqoop, TOAD, UNIX.
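The star-schema claims load mentioned above follows a standard pattern: resolve each fact row's surrogate key from the dimension, then insert into the fact table. This sketch uses SQLite and invented table and column names, not the actual claims warehouse schema.

```python
import sqlite3

# Minimal star-schema load: resolve surrogate keys from a dimension, then insert facts.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_member (member_sk INTEGER PRIMARY KEY, member_id TEXT UNIQUE);
CREATE TABLE fact_claim (claim_id TEXT, member_sk INTEGER, paid_amount REAL);
""")
conn.executemany("INSERT INTO dim_member (member_id) VALUES (?)",
                 [("M100",), ("M200",)])

# Incoming claim rows carry the natural key (member_id), not the surrogate key.
claims = [("CL1", "M100", 250.0), ("CL2", "M200", 80.0)]
for claim_id, member_id, amount in claims:
    sk = conn.execute("SELECT member_sk FROM dim_member WHERE member_id = ?",
                      (member_id,)).fetchone()[0]
    conn.execute("INSERT INTO fact_claim VALUES (?, ?, ?)", (claim_id, sk, amount))

total = conn.execute("SELECT SUM(paid_amount) FROM fact_claim").fetchone()[0]
print(total)  # 330.0
```

In the Talend jobs described above, the surrogate-key resolution step would typically be a tMap lookup against the dimension table rather than a per-row query.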
