
BI Technical Lead Resume

Irvine, CA


  • Almost 10 years of progressive hands-on experience in data warehousing and ETL processes using Informatica, Talend, and Report Writer.
  • Excellent understanding of ETL, Dimensional Data Modeling techniques, Slowly Changing Dimensions (SCD) and Data Warehouse Concepts - Star and Snowflake schemas, Fact and Dimension tables, Surrogate keys, and Normalization/De-normalization.
  • Designed OLTP and OLAP data models (star and snowflake schemas).
  • Experience in the Data Warehouse/Data Mart development life cycle using both the Kimball and Inmon approaches.
  • Expertise in DWH technical architecture, design, business requirement definition and Data Modeling. Responsible for designing, coding, testing, integrating the ETL processes and implementing the reporting requirements.
  • Adept with Informatica PowerExchange, PowerCenter, Cloud Services (ICS), and Data Quality (IDQ).
  • Well versed in ETL procedures to load data from sources such as Oracle, Netezza, DB2, flat files, and XML files into the DWH, and adept with DDL and DML in the RDBMS world.
  • Used Python scripting to SFTP files to vendors for compliance and to retrieve vendor files via Tumbleweed.
  • Adept at sourcing web services data into data warehouse platforms.
  • Created data lakes for data coming from Salesforce applications and front-end UIs.
  • Big Data analytics in Hive databases.
  • Used Tivoli and Control-M to schedule ETL jobs in production.
  • Design and develop code in SQL and PL/SQL. Comfortable developing UNIX shell scripts to run SQL scripts and Informatica workflows from UNIX server.
  • Analytical and Technical aptitude with ability to work in a fast paced, challenging environment and keen to learn new technologies.
  • Working expertise in Agile/Scrum and Waterfall Methodologies.
  • Experience as a Scrum Master, handling project teams of 3-20.
  • Proficient in understanding business processes/requirements and translating them into technical requirement specifications.
  • Excellent interpersonal and communication skills, technically competent and result-oriented with problem solving skills and ability to work effectively as a team member as well as independently.
  • Developed effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions and effectively manage client expectations.
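
The Slowly Changing Dimension handling summarized above (Type 2 in particular) is the kind of logic these ETL mappings implement. As a rough illustration only, a minimal Python sketch of SCD Type 2 versioning, with hypothetical column names (`eff_from`/`eff_to`), might look like:

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "current row" effective date

def apply_scd2(dim_rows, incoming, key, tracked, load_date):
    """Apply SCD Type 2: end-date changed rows and insert new versions.

    dim_rows -- list of dicts, each carrying 'eff_from'/'eff_to' columns
    incoming -- list of source dicts identified by `key`
    tracked  -- attribute names whose change triggers a new version
    """
    current = {r[key]: r for r in dim_rows if r["eff_to"] == HIGH_DATE}
    for src in incoming:
        cur = current.get(src[key])
        if cur and all(cur[c] == src[c] for c in tracked):
            continue  # no change in tracked attributes: keep current row
        if cur:
            cur["eff_to"] = load_date  # expire the old version
        new_row = dict(src)
        new_row["eff_from"] = load_date
        new_row["eff_to"] = HIGH_DATE
        dim_rows.append(new_row)  # insert the new current version
    return dim_rows
```

In an ETL tool this same pattern is built as a lookup on the current dimension row plus an update/insert branch; the sketch only shows the decision logic.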


ETL Tools: Informatica PowerCenter 10.x/9.x/8.x/7.x, Pentaho, OLAP, OLTP, Talend

Reporting Tools: Anaplan, Business Objects, MicroStrategy, and Tableau.

Env Tools: Tivoli/Control-M (scheduling), ClearCase (versioning), Hudson (deployment)

Databases: Netezza 6.X, Oracle 12c/10g/9i/8i, MS SQL Server, MySQL

Languages: SQL, PL/SQL, Java, C, Shell Scripting, Perl, PHP, XML, HTML

Data Modeling: MS Visio, Erwin

Tools: SQL Developer, Toad, SQL*Plus, AutoSys, MS Office.

Environment: Unix, Windows 7/XP/Vista, Linux, Mac OS X

IT Concepts: Data Structures and Algorithms, Software Development Life Cycle.


Confidential, Irvine, CA

BI Technical Lead


  • Techno-Functional role in business process.
  • Knowledge of sales process management, including transaction processing, compensation, and territory assignment.
  • Functional knowledge of mutual fund and non-mutual-fund transaction life cycles.
  • Performed business analysis and captured requirements as user stories and a BRD.
  • Collaborated with business stakeholders to capture user and project requirements, document data flows, create clear and concise output and/or reports, and perform functional testing.
  • Scrum Master to the IT team on agile methodology implementation.
  • Converted the BRD to an FRD and NFRD.
  • Data analysis and data modeling.
  • Identify the data sources and analysis on critical data elements.
  • Business-data analysis on Asset Management platforms such as AFTP, LASR, CDM, SFDC
  • Profiled data to identify anomalies and bridged sources with a data-correction mechanism.
  • Work with data stewards on data gaps and leverage the data flow.
  • Design and document the Enterprise level logical model using Erwin data model tool.
  • Designed conceptual, logical, and physical data models for OLTP/OLAP and object-oriented data structures.
  • Architected cloud-to-cloud and cloud-to-on-premise integrations via Informatica and MuleSoft.
  • ETL development using Informatica and MuleSoft.
  • Hands-on ETL development via Informatica PowerCenter, cloud integration, and single sign-on (SAML).
  • Develop ETL mapping specifications for loading information into the data/cloud clusters and for ensuring reliability of information loaded.
  • Design/Enhance the Sales Data Model for addition of New Entities and build Relational Subject areas.
  • Designed and developed complex SQL logic to read data from various sources.
  • Designed SPM models on the Anaplan SaaS platform.

Environment: Informatica PowerCenter v10.x, ICS, IDQ 9.x, MuleSoft 4.x, Anaplan, Jira.

Confidential, Madison, WI

Data Warehouse Developer


  • Techno-Functional role in business process.
  • Knowledge of the KIDS system.
  • Knowledge of the child support business process.
  • Data Analysis and ETL architecture.
  • Analyzed various modes of the data life cycle.
  • Design the data profile standards to investigate the possible source data within relevant sources to support the enterprise wide integration.
  • Supported data-driven analytics to perform qualitative research for business stakeholders, current-state analysis, reporting (ad hoc and regular), and future-state/what-if scenarios.
  • Design and implement standards by working with stakeholders and data stewards to develop and track data quality and integrity measures using reporting tools and dashboards.
  • ETL development using Informatica.
  • Designed the data model to support OLAP.
  • Parsed high-level design specs into simple ETL coding and mapping standards.
  • End-to-end ETL development of the Premium Module Data Warehouse.
  • Led junior (Dev1) resources in a team of 15, tracking development and prototyping complex Informatica mappings/mapplets.
  • Designed and developed complex SQL logic to read data from various sources.
  • Designed and developed the landing, staging, and warehouse layers for child support.
  • Designed and developed stored procedures for warehouse loading via index manipulation, partitioning strategy, and database triggers.
  • Designed, developed, and implemented Informatica mappings to capture the audit queries that audit columns associated with data columns in Oracle/DB2.
  • Designed Control-M jobs to run batches that execute Informatica workflows.
  • Performance tuning and bottleneck resolution on both the database and the Informatica tool.
  • Report development using SAP Business Objects and Tableau.
  • Created standard and Ad hoc reports using Business Objects.
  • Developed WEBI reports for customer needs to report on predictive analysis.
  • Assisted with universe builds.
  • Data Lakes for OLTP/OLAP.
  • Worked on the OLTP architecture and a POC on OLAP architecture in data lakes.
  • POCs on RDBMS warehouse soft and hard deletes.

Environment: Informatica PowerCenter v10.x, DB2, Oracle, Control-M, Business Objects, and Tableau.
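
The audit mappings in this role boil down to source-to-target reconciliation. A hedged Python sketch of the row-count comparison such audits typically perform (the table names are made up for illustration):

```python
def reconcile(source_counts, target_counts, tolerance=0):
    """Compare source vs. target row counts per table and flag mismatches.

    source_counts/target_counts -- dicts of table name -> row count
    Returns a list of (table, source_count, target_count) tuples whose
    difference exceeds the tolerance.
    """
    mismatches = []
    for table, src in source_counts.items():
        tgt = target_counts.get(table, 0)  # missing target table counts as 0
        if abs(src - tgt) > tolerance:
            mismatches.append((table, src, tgt))
    return mismatches
```

In practice the counts would come from audit queries against Oracle/DB2; the sketch only shows the comparison step that decides whether a load passes its health check.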

Confidential, Madison, WI

Data/ETL Analyst - Consultant


  • Techno-Functional role in business process.
  • Knowledge of Guidewire and ICS platform integration.
  • Guidewire policy and claims business processes.
  • Story card and gap analysis.
  • Data Analysis and ETL process.
  • Built ETL mappings to map data from the ICS platform to the Guidewire platform.
  • Designed and developed the pre-staging and staging layers for data conversion.
  • Drafted and executed stored procedures to pull data from non-elementary storage units to match transformations for conversion impact.
  • Pinged policy web services using Informatica PowerCenter to match each claim to its corresponding policy service.
  • Worked on sourcing web services data into data warehouse platforms.
  • Created data lakes from data coming from Salesforce applications and front-end UIs.
  • Designed, developed, and implemented health checks using Informatica mappings to capture the audit queries that audit columns associated with data columns in Oracle.
  • Designed AutoSys jobs to run batches that execute Informatica workflows.
  • Performance tuning for the Metadata batch process to optimize the data load timings.
  • Usage of Deployment groups in code migration from lower to higher environments.
  • Coordinated with offshore teams to manage day-to-day development activities.

Environment: Informatica PowerCenter v9.x, DB2, Oracle, AutoSys, ClearCase, Hudson, SoapUI, Web Services
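
Matching each claim to its policy service, as described above, reduces to a keyed lookup against the policy source. A minimal Python sketch of that matching step (the field names `claim_id` and `policy_no` are assumptions, not the actual Guidewire schema):

```python
def match_claims_to_policies(claims, policies):
    """Match each claim to its policy by policy number.

    claims/policies -- lists of dicts with hypothetical keys
    Returns (matched_claim_ids, orphan_claim_ids).
    """
    by_number = {p["policy_no"]: p for p in policies}  # index policies once
    matched, orphans = [], []
    for c in claims:
        # orphan claims (no matching policy) are flagged for data correction
        (matched if c["policy_no"] in by_number else orphans).append(c["claim_id"])
    return matched, orphans
```

The orphan list is what a conversion audit would route back to data stewards for correction before loading.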

Confidential, San Ramon, CA

ETL Data Warehouse Lead - Consultant


  • Techno-Functional role in business process.
  • Wealth management knowledge on PCS, BWIS customer services.
  • KYC/KYA understanding and application process.
  • Adept with Pershing process management.
  • Data model design and development.
  • Designed data modules per Bill Inmon's DWH approach.
  • Data modeling using the Erwin tool for star and snowflake schemas.
  • Normalization (1NF, 2NF, 3NF etc.)
  • Data captured as SCDs in the EDW needed to be further maintained as SCDs in the WDM.
  • Data analysis and data development.
  • Designed and developed data-profiling standards for master data using Oracle Warehouse Builder.
  • Assisted in mining master data using Oracle Data Mining.
  • Designed and developed data quality standards using Informatica Data Quality (IDQ).
  • Perform data virtualization using Denodo.
  • Assisted with data archival using Oracle exchange partitions.
  • ETL development using Informatica and Report Writer.
  • Parse high-level design spec to simple ETL coding and mapping standards.
  • Implemented ETL to form the dimensions and facts that constitute a data mart.
  • Developed ETL code in the Informatica PowerCenter tool: Source Analyzer, Warehouse Designer, and Mapping/Mapplet Designer.
  • Solved complex data-in/data-out logic for the ETL process.
  • Developed multiple dimensions and facts associated with subject areas.
  • Used shell scripting to keep batch jobs in sync.
  • Performance tuning for the Metadata batch process to optimize the data load timings.
  • Used pushdown optimization in ETL to process loads without database congestion.
  • Maintain warehouse metadata, naming standards and warehouse standards for future application development.
  • Used Report Writer to route files to the bank-specific server.
  • Design and develop UNIX scripts for the File checks/audits once file received from vendor.
  • Design, develop and Implement health checks using Informatica mappings to capture the audit queries that run to audit columns associated to the data columns in Oracle.
  • Used shell scripting to enforce business-logic checks on files received from vendors.
  • Used Python to SFTP files to external vendors for audit and compliance.
  • Reporting layer developments in Tableau and Business Objects.
  • Prospect customer reports development and extraction from Business Objects.
  • POCs for advanced and predictive analytics.
  • Big Data mining in Hive databases.
  • Analyzed differences between RDBMS and HDFS using Informatica Cloud (ICS).
  • Queried Hive databases on Hadoop file systems in Big Data.
  • ETL extraction from OLTP to EDW through Talend.
  • Data integration from Staging to EDW via Talend.

Environment: Informatica PowerCenter v10.x/9.x, Informatica Cloud (ICS), Report Writer, Talend, SQL Server 2012, Tidal, SVN, Oracle Warehouse Builder 11g, IDQ 9.x, Denodo 6.x, Oracle exchange partitions, HDFS.
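
The file checks/audits run on vendor files in this role typically validate a trailer record against the detail count before the batch proceeds. A hedged Python sketch, assuming a hypothetical pipe-delimited layout whose last line is a `T|<count>` trailer:

```python
def audit_vendor_file(lines):
    """Validate a pipe-delimited vendor file ending in a 'T|<count>' trailer.

    lines -- the file content as a list of strings
    Returns (ok, detail_count, trailer_count); trailer_count is None
    when the trailer record itself is missing.
    """
    if not lines or not lines[-1].startswith("T|"):
        return (False, len(lines), None)  # missing trailer: reject the file
    trailer_count = int(lines[-1].split("|")[1])
    detail = [l for l in lines[:-1] if l.strip()]  # ignore blank lines
    return (len(detail) == trailer_count, len(detail), trailer_count)
```

A failed check would stop the downstream load and route the file back to the vendor, which is the role the shell-script audits above play in the batch.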

Confidential, Brea, CA

BI Lead - Consultant


  • Techno-Functional role in business process.
  • Guidewire platform domain knowledge of the Policy, Billing, and Claims Centers.
  • Business process and vendor management for data feeds and third-party vendors.
  • ALM for auto, homeowners, and P&C.
  • Develop business requirement document (BRD).
  • Convert business requirements to functional requirements (BRD to FRD).
  • Data Analysis on the transactional systems for downstream analytical model.
  • Work with data-stewards in identifying the anomalies in data and correction mechanism.
  • Gap analysis on CDE’s (critical data elements).
  • Performed pre-trade and post-trade transaction cost analysis for claimant vs. recovery reserves in the insurance domain to determine subrogation opportunities.
  • Design of the data model.
  • Designed the data mart following the Ralph Kimball paradigm.
  • Identified the parent and child relationships and hierarchy of data flow.
  • Designed and incorporated normalization techniques (1NF, 2NF, 3NF) into data models.
  • Lead BI role on both ETL and reporting layers.
  • Understanding the requirements and prepare estimates for the tasks.
  • Prepare development plan on onsite and offshore model.
  • Development of ETL mapping prototypes for team to follow.
  • Coordinating the work hand offs between offshore and onsite.
  • Change Management and ALM.
  • ETL architect and develop the mappings for enterprise data warehouse and data marts.
  • Design and develop the ETL environment in Informatica.
  • Designed and developed XML extraction using the ETL tool Talend.
  • Setting job dependencies and workflow methodologies.
  • Architect the data flow on dimensions and facts.
  • Lead development role on ETL tools such as Informatica, Talend.
  • Develop Informatica mappings according to Functional specification to extract data to ODW and ESDW from ODS sources.
  • Develop Informatica workflows and perform the unit testing for the developed mappings.
  • Design shell scripts for the source triggering events on various sources.
  • Contributed to the development of Perl scripts for reconciliation between source and target.
  • Generated ETL workflows that source the ODS and generate an external file transferred to vendors via FTP.
  • Scheduled data-feed jobs through Tivoli to send files to external vendors.
  • ETL repository management.
  • Design Code retrofit mechanism.
  • Used Hudson to move deployment groups across repositories.
  • Migration activity via Deployment Groups.
  • Performance Tuning
  • Create materialized views for complex user queries and rebuild them after every ETL load.
  • Developed data archival scripts using Oracle's exchange partition feature to archive fact data from the fact tables to a different schema.
  • Developed multiple database monitoring scripts that run periodically and send out an email when there is an issue (sample scripts: temp space usage reports, tablespace usage reports, library lock details, etc.).
  • Application production support
  • Monitor and Work on ticket resolutions.
  • On Call production support.
  • Providing ongoing support for service view applications at agreed service levels.
  • Ensure application changes are fully documented and supportable.
  • Identifying opportunities for change within the production environment.

Environment: Informatica PowerCenter v9.x, Netezza, SQL Server 2012, Talend, MSTR, Control-M, ClearCase, Hudson.
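
The database monitoring scripts described above (tablespace usage reports and the like) follow a simple threshold-alert pattern. A minimal Python sketch, assuming the usage figures have already been queried from the database's dictionary views:

```python
def check_tablespaces(usage, threshold_pct=85.0):
    """Return alert lines for tablespaces above the usage threshold.

    usage -- dict of tablespace name -> (used_mb, total_mb)
    """
    alerts = []
    for ts, (used, total) in sorted(usage.items()):
        pct = 100.0 * used / total
        if pct > threshold_pct:
            # each alert line would become one row of the email body
            alerts.append(f"{ts}: {pct:.1f}% used ({used}/{total} MB)")
    return alerts
```

A scheduler entry (Control-M in this role) would run the check periodically and mail the returned lines only when the list is non-empty, which matches the "email when there is an issue" behavior described above.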

Confidential, Oklahoma City, OK

Data Analyst


  • Assisted in gathering business requirements and converting the BRD to an FRD.
  • Channeled data governance and data standards to downstream groups.
  • Prototyped the data-chain mechanism.
  • Data Analysis on integration patterns across multiple subject areas.
  • Data mapping for module integration on resource management and payroll management.
  • Data mining on the anomalies pertaining to the entity-entity relationships.

Environment: PL/SQL, Oracle, Visio, UltraEdit, SAP Fieldglass.
