
Informatica/ETL Developer Resume


California

TECHNICAL SKILLS:

  • MS SQL Server Database Administrator/Developer, BI, ETL, Healthcare and Business settings, SQL Server Configuration, Replication, Virtualization.
  • Visual Studio, C#, .NET, C++, VB, SDLC, SharePoint, PowerShell scripts, REST API, Open Data Protocol (OData), MS Excel, Access, Visio, Oracle, Java, Cogito, Star, Radar, Workbench, JavaScript
  • SSMS, SSRS, SSAS, SSIS, SaaS, ETL, Crystal Reports
  • NoSQL, PL/SQL, MySQL, T-SQL (Transact-SQL), SQL queries, SQL Server architecture
  • Data Science, Data Analysis, data mining, Data Warehouse, Business Intelligence, Statistical analysis, concatenations, pivot tables, Table partitioning and archiving, Data cubes, Data marts, Big Data, JSON and XML data interchange
  • Stored procedure optimization, indexing, consistency checks, performance tuning; SQL Server log shipping, SQL replication, scripting, database fine-tuning, function and trigger design and coding, index implementation and maintenance, clustering
  • Random Forest, Machine Learning, Pentaho, Python, PowerPivot, Power View, MATLAB/R, Ruby, Ruby on Rails, Agile, Waterfall, E-Commerce, Hadoop, SSAS OLAP cubes, Pig, Hive, Spark, Oracle, Hibernate, MyBatis, Spring Data
  • Tableau Architect, PHP, SQL Server Integration and Analysis Services, Data cubes, Data Science, Data Analysis, Data Warehouse Architect, mapping
  • E-Commerce, Hadoop, Big Data, MapReduce, Allscripts, R, HBase, Data modeling, HR Analytics, Data Integration architecture
  • OLTP, OLAP, database design, performance tuning and security model implementations
  • BI and analytic tools, Business Objects, QlikView, Tableau, COGNOS, GuideWire DB
  • Agile, Waterfall, Scrum development methodology, Web Services, Hyperion, OBIEE, Informatica
  • Healthcare, Tapestry, FACETS, Epic Clarity, HIM, Meditech, TriZetto Reporting, Eclipsys, Allscripts, Cerner, Siemens and McKesson EMR, Epic systems, HIPAA, EDI, Revenue Cycle, HEDIS, SOX, EHR

PROFESSIONAL EXPERIENCE:

Confidential

INFORMATICA/ETL DEVELOPER

Responsibilities:

  • Implemented business requirements, environment dependencies and integration points
  • Wrote SQL against the Guidewire database to find items, issues, claims, and reports (see the query sketch after this list)
  • Developed end-to-end architecture designs for big data solutions based on a variety of business use cases
  • Presented architecture designs to various stakeholders and customers, and to the server, network, security, and other teams
  • Provided technical leadership and governance of the big data team and the implementation of solution architecture
  • Managed architecture design changes driven by business requirements and other interface integration changes
  • Provided overall architect responsibilities, including roadmaps, leadership, planning, technical innovation, security, and IT governance
  • Designed, laid out, and deployed Hadoop clusters in the cloud using the Hadoop ecosystem and open-source platforms
  • Configured and tuned production and development Hadoop environments with the various interoperating Hadoop components
  • Provided end-to-end systems implementation, addressing data security and privacy concerns
  • Designed and implemented geospatial big data ingestion, processing and delivery
  • Provided cloud-computing infrastructure solutions on Amazon Web Services (AWS): EC2, VPCs, S3, IAM
  • Worked with Enterprise and Big Data Architecture
  • Implemented, set up, and worked with Hadoop
  • Worked with Active Directory, LDAP, and Identity Management Integration
  • Hands-on experience implementing big data solutions on geospatial data
  • Expert in architecting end-to-end big data solutions based on a variety of business use cases
  • Led the implementation and operationalization of an enterprise-wide big data solution in a cross-functional, cross-technology/domain environment
  • Worked on administration, configuration management, monitoring, debugging, performance tuning, and technical resolution across the Hadoop application suite and platform: MapReduce, Hive, HBase, Spark, Flume, Oozie, Tez, Ambari, Kafka, Pig, Accumulo, Storm, Falcon, Atlas, Sqoop, NFS, WebHDFS, Hue, Knox, Ranger, Impala, ZooKeeper
  • Worked with Oracle, SQL Server, and MySQL databases
  • Programmed using Java, Linux, PHP, Ruby, Python, R, Informatica, and Tableau
  • Worked with Spark, HBase, Hive, Sqoop, Oozie, Flume, Java, Scala, Pig, Python
  • Worked on multiple projects: mastering provider data; building a new dimensional data warehouse; consumer, employer, and provider analytics; and developing new data marts for existing data warehouses. Worked with Informatica PowerCenter, Oracle, dimensional data modeling, healthcare payor data, data warehousing, SSIS, ERwin, PL/SQL, Informatica MDM, IDD, Data Analyst, EPIC, Facets, TOAD, and Salesforce. Responsible for development and administration of Informatica ETL mappings and job schedules. Worked with the BI team on the design and development of the data warehouse environment through integration of various data sources. Developed and maintained ETL load schedules, provided operational support and troubleshooting for existing ETL jobs, and advised the BI team on Informatica and the data needs of the business. (A dimensional-load sketch also follows this list.)
  • Worked as an ETL Informatica Developer with SAP connectors; developed and maintained ETL jobs and job schedules for the EDW environment; delivered improvements to ETL job performance and data quality; provided operational support and troubleshooting for ETL jobs; and designed, assessed, and developed table structures, data relationships, and integration mappings for the EDW environment
  • Proficient with Internet, email, Microsoft programs, Informatica PowerCenter 9, data warehousing, the Inmon Corporate Information Factory and Kimball star schema approaches, the Informatica Velocity methodology, Oracle RDBMS versions 10 and 11, and Oracle PL/SQL
  • As an ETL Developer/BI Data Warehouse Architect:
  • Reviewed and analyzed existing and new data analytics warehouse environments and other external sources in order to design and architect a system communicating information to data analytics staff and clients. Worked on the implementation of a data analytics architecture designed to accept structured and unstructured datasets. Worked on the implementation of the Pentaho ETL environment, including fact and dimensional models based on use-case development from business analysis teams, and on the design of Tableau/OBIEE dashboards and reporting. Worked on the extraction, transformation, and loading of newly identified and acquired data into a secure data analytics environment. Worked with the CIO and business customers to define and build ETL transformations in Pentaho. Provided ongoing knowledge transfer to State resources on all of the work above. Worked with both technical and non-technical staff; provided presentations; demonstrated and communicated complex technical information to non-technical audiences; and provided mentoring and knowledge transfer through hands-on training.
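The Guidewire queries mentioned in this list generally took the shape below: join claims to their policy and line items, filter by status, and aggregate for a report. The following is a minimal, self-contained sketch using Python's built-in sqlite3; the policy/claim/claim_item schema is hypothetical and merely stands in for Guidewire's actual data model, which is not shown in this resume.

# Claims-reporting query sketch. Schema and data are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE policy (policy_id INTEGER PRIMARY KEY, holder_name TEXT);
    CREATE TABLE claim  (claim_id INTEGER PRIMARY KEY, policy_id INTEGER,
                         status TEXT, loss_date TEXT);
    CREATE TABLE claim_item (item_id INTEGER PRIMARY KEY, claim_id INTEGER,
                             description TEXT, amount REAL);
    INSERT INTO policy VALUES (1, 'A. Smith');
    INSERT INTO claim  VALUES (10, 1, 'OPEN', '2015-03-01');
    INSERT INTO claim_item VALUES (100, 10, 'Windshield', 450.00);
""")

# Report: open claims with their item counts and totals, joined to the holder.
rows = conn.execute("""
    SELECT p.holder_name, c.claim_id, c.loss_date,
           COUNT(i.item_id) AS item_count, SUM(i.amount) AS total_amount
    FROM claim c
    JOIN policy p          ON p.policy_id = c.policy_id
    LEFT JOIN claim_item i ON i.claim_id = c.claim_id
    WHERE c.status = 'OPEN'
    GROUP BY p.holder_name, c.claim_id, c.loss_date
""").fetchall()
for row in rows:
    print(row)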
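Likewise, the Informatica/Kimball work described above comes down to the classic dimension-then-fact load: upsert dimension rows, then load facts by surrogate-key lookup. Below is a sketch of that pattern in plain Python + SQL; the staging and warehouse table names are assumed for illustration, and the real mappings ran in PowerCenter against Oracle rather than in code.

# Dimension-then-fact load sketch. Table and column names are illustrative.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE stg_encounter (provider_npi TEXT, encounter_date TEXT, charge REAL);
    CREATE TABLE dim_provider  (provider_key INTEGER PRIMARY KEY AUTOINCREMENT,
                                provider_npi TEXT UNIQUE);
    CREATE TABLE fact_encounter (provider_key INTEGER, encounter_date TEXT, charge REAL);
    INSERT INTO stg_encounter VALUES ('1234567890', '2015-06-01', 125.00);
""")

# Step 1: upsert the dimension so every incoming NPI has a surrogate key.
conn.execute("""
    INSERT OR IGNORE INTO dim_provider (provider_npi)
    SELECT DISTINCT provider_npi FROM stg_encounter
""")

# Step 2: load the fact table by looking up surrogate keys from the dimension.
conn.execute("""
    INSERT INTO fact_encounter (provider_key, encounter_date, charge)
    SELECT d.provider_key, s.encounter_date, s.charge
    FROM stg_encounter s JOIN dim_provider d USING (provider_npi)
""")
print(conn.execute("SELECT * FROM fact_encounter").fetchall())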

Confidential, California

DATA WAREHOUSE/INFORMATICA/ETL DEVELOPER

Responsibilities:

  • Worked on data architecture, relational and dimensional data modeling, and metadata management
  • Used data modeling tools such as Embarcadero ER/Studio, ERwin, and other toolsets.
  • Wrote SQL against the Guidewire database to find items, issues, claims, and reports.
  • Worked with relational databases (DB2, Oracle, SQL Server), NoSQL databases (Cassandra, MongoDB), and the columnar database Vertica
  • Used SQL skills, querying large, complex data sets and performance analysis.
  • Expert in ETL platforms, SQL, NoSQL, and Hadoop (Hive/Scala/Spark).
  • Worked in modeling, managing, scaling, and performance tuning of high-volume OLTP, OLAP, and data warehouse environments.
  • Built Logical and Physical data models
  • Worked on Performance tuning and optimization
  • Worked with multiple database concepts: RDBMS, OODB, ODS, and data warehouse
  • Worked on building data marts and migrating data into them (see the sketch after this list)
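A data-mart build of the kind described above typically reads from warehouse tables, aggregates to the mart's grain, and writes a partitioned table for reporting tools. Below is a minimal PySpark sketch under assumed table names (edw.fact_claim, mart.provider_claims_summary); it presumes a Spark installation with Hive support and is a pattern illustration, not the actual job.

# Data-mart load sketch in PySpark. All table and column names are assumed.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (SparkSession.builder
         .appName("claims_mart_load")
         .enableHiveSupport()
         .getOrCreate())

# Read from the warehouse, aggregate to the grain the mart needs,
# and write a partitioned mart table that reporting tools can query.
claims = spark.table("edw.fact_claim")  # hypothetical warehouse fact table
mart = (claims
        .groupBy("claim_year", "provider_id")
        .agg(F.count("*").alias("claim_count"),
             F.sum("paid_amount").alias("total_paid")))
(mart.write
     .mode("overwrite")
     .partitionBy("claim_year")
     .saveAsTable("mart.provider_claims_summary"))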

Confidential, Nashville

BIG DATA ENTERPRISE ARCHITECT HADOOP/PENTAHO/INFORMATICA DEVELOPER

Responsibilities:

  • Implemented business requirements, environment dependencies and integration points
  • Wrote SQL against the Guidewire database to find items, issues, claims, and reports.
  • Developed end-to-end architecture designs for big data solutions based on a variety of business use cases
  • Presented architecture designs to various stakeholders and customers, and to the server, network, security, and other teams.
  • Provided technical leadership and governance of the big data team and the implementation of solution architecture
  • Managed architecture design changes driven by business requirements and other interface integration changes
  • Provided overall architect responsibilities, including roadmaps, leadership, planning, technical innovation, security, and IT governance
  • Designed, laid out, and deployed Hadoop clusters in the cloud using the Hadoop ecosystem and open-source platforms
  • Configured and tuned production and development Hadoop environments with the various interoperating Hadoop components
  • Provided end-to-end systems implementation, addressing data security and privacy concerns
  • Designed and implemented geospatial big data ingestion, processing and delivery
  • Provided cloud-computing infrastructure solutions on Amazon Web Services (AWS): EC2, VPCs, S3, IAM (a provisioning sketch follows this list)
  • Worked with enterprise and big data architecture; implemented, set up, and worked with Hadoop
  • Worked with Active Directory, LDAP, and Identity Management Integration
  • Hands-on experience implementing big data solutions on geospatial data
  • Expert in architecting end-to-end big data solutions based on a variety of business use cases
  • Led the implementation and operationalization of an enterprise-wide big data solution in a cross-functional, cross-technology/domain environment
  • Expert in administration, configuration management, monitoring, debugging, performance tuning, and technical resolution across the Hadoop application suite and platform: MapReduce, Hive, HBase, Spark, Flume, Oozie, Tez, Ambari, Kafka, Pig, Storm, Falcon, Atlas, Sqoop, NFS, WebHDFS, Hue, Knox, Ranger, Impala, and ZooKeeper
  • Worked with Oracle, SQL Server, and MySQL databases; programmed using Java, Linux, PHP, Ruby, Python, R, Informatica, and Tableau
  • Performed data modeling and data warehousing work with Oracle PL/SQL
  • Worked with star and snowflake schemas, indexing, aggregate tables, dimension tables, constraints, keys, and fact tables
  • Technical expertise in data models, database design development, data mining and segmentation techniques
  • Strong knowledge and experience working with ETL frameworks, reporting packages, databases, and PL/SQL programming.
  • Implemented data requirements for project- and solution-based client needs, based on how the data would be used in upstream and downstream applications, and guided the team accordingly on logical and physical models
  • Applied strong analytical skills to collect, organize, analyze, and disseminate large amounts of information with detail and accuracy, writing queries, identifying errors in data, and presenting findings
  • Data warehousing tools used: data marts, materialized views, star schemas, ERwin, Oracle.
  • Worked with Hadoop distributions (Cloudera/Hortonworks/MapR) and their ecosystem components, and with NoSQL databases such as HBase, MongoDB, and Cassandra
  • Worked on data gathering and on designing and developing scalable big data solutions with Hadoop
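The AWS work noted earlier in this list (EC2, VPCs, S3, IAM) can be illustrated with boto3. The sketch below is illustrative only: the region, CIDR ranges, bucket name, and AMI ID are placeholders, and it assumes AWS credentials are already configured.

# AWS provisioning sketch with boto3 (pip install boto3). Placeholder values.
import boto3

ec2 = boto3.client("ec2", region_name="us-west-2")
s3 = boto3.client("s3", region_name="us-west-2")

# Network first: a VPC with a subnet for the cluster nodes.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")
vpc_id = vpc["Vpc"]["VpcId"]
subnet = ec2.create_subnet(VpcId=vpc_id, CidrBlock="10.0.1.0/24")

# Landing bucket for raw data ingested into the cluster.
s3.create_bucket(Bucket="example-hadoop-landing-zone",
                 CreateBucketConfiguration={"LocationConstraint": "us-west-2"})

# A single worker node as an illustration; real clusters are sized per workload.
ec2.run_instances(ImageId="ami-00000000",  # placeholder AMI ID
                  InstanceType="m4.xlarge",
                  MinCount=1, MaxCount=1,
                  SubnetId=subnet["Subnet"]["SubnetId"])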

Confidential, California

DATA WAREHOUSE ARCHITECT/ETL DEVELOPER

Responsibilities:

  • Worked as a Data Analyst/Scientist with experience in computer science
  • Worked in statistical modeling and the end-to-end life cycle of Data Analysis
  • Integrated data from Hadoop/big data storage, using metrics derived from the large datasets
  • Transformed data into Data Visualizations for reporting
  • Developed, refined and scaled data management and analytics procedures, systems, workflows.
  • Worked in the design and development of solutions for large volumes of data
  • Used the statistical programming languages R and Python (see the sketch after this list)
  • Responsible for creating and maintaining Customer Intelligence Analytics Data Warehouse and Data Modeling.
  • Created Source to Target Mappings and Facilitated Data Warehouse Model Reviews.
  • Responsible for Customer Data Integration (CDI) participating in Data Modeling, JADs, Data Mapping and review sessions, source to target mappings, creating Business Conceptual Models, Logical Data Models and Physical Models.
  • Facilitated Model Review and Mapping Sessions
  • Worked with various client centers and teams to identify sources of data and the associated integration requirements.
  • Developed and proposed an overall enterprise architecture to support client objectives.
  • Developed prototypes and helped system development team implement the selected technologies.
  • Helped identify and resolve technical challenges throughout the deployment process
  • Collaborated with team, leadership, and compliance personnel to understand problems, identify solutions, and demonstrate software solutions.
  • Designed and implemented state-of-the-art analytic techniques and helped design the core analytics platform.
  • Provided creative and technical skills, guiding a team in developing cutting-edge machine learning algorithms using Java.
  • Developed highly distributed big data algorithms using Hadoop, Spark, MongoDB, and graph databases.
  • Worked in the design, development, implementation, testing, documentation, and support of software using standard, established software engineering principles
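The R/Python statistical work above followed a familiar pattern: summarize a metric, fit a simple model, and report fit quality. A small Python sketch with NumPy follows; the data is invented for illustration, while the real analyses ran against large Hadoop-resident datasets.

# Statistical summary and trend-fit sketch (assumes numpy is installed).
import numpy as np

# Hypothetical monthly metric pulled from the warehouse.
months = np.arange(1, 13)
visits = np.array([310, 325, 330, 355, 340, 370, 385, 390, 410, 405, 430, 445])

print(f"mean={visits.mean():.1f}, std={visits.std(ddof=1):.1f}")

# Ordinary least-squares trend line (degree-1 polynomial fit).
slope, intercept = np.polyfit(months, visits, 1)
predicted = slope * months + intercept

# R^2: proportion of variance explained by the trend.
ss_res = ((visits - predicted) ** 2).sum()
ss_tot = ((visits - visits.mean()) ** 2).sum()
print(f"trend: {slope:.1f} visits/month, R^2={1 - ss_res / ss_tot:.3f}")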
