10+ years of strong experience with the clients of Confidential, Confidential, and Confidential, working as a Big Data Engineer, Data Analyst, Business System Analyst, Project Manager, and Consultant with direct involvement in business analysis, system integration, Big Data enablement, process automation, and digital integration. Strong experience with Data Lake, Business Intelligence, Hadoop, Big Data, and Data Warehousing concepts, with an emphasis on ETL, SDLC, and Linux. Developed data integration strategies for data flow between disparate source systems and a Big Data enabled Enterprise Data Lake. Extensive experience designing solutions for high-volume data stream ingestion, processing, and low-latency data provisioning using Hadoop ecosystem tools: Hive, Pig, Sqoop, Kafka, Python, Spark, Scala, NoSQL, NiFi, and Druid.
Big Data Engineer - Big Data Innovation
- Developed a Spark/Scala based analytics and reporting platform for Confidential and Fido customer cross-channel analytics with daily incremental data uploads.
- Implemented a batch process for high-volume data loading with the Apache NiFi dataflow framework, following an Agile development methodology.
- Implemented a Data Lake to consolidate data from multiple source databases such as Exadata and Teradata using Hadoop stack technologies (Sqoop, Hive/HQL).
- Developed real-time streaming applications integrated with Kafka and NiFi to handle high-volume, high-velocity data streams in a scalable, reliable, and fault-tolerant manner for Confidential campaign management analytics.
- Developed Scala scripts and UDFs using both DataFrames/SQL and RDD/MapReduce in Spark for data aggregation and queries, writing data back into HDFS through Sqoop.
- Implemented ETL jobs using NiFi to import data from multiple databases such as Exadata, Teradata, and MS SQL into HDFS for Business Intelligence (MicroStrategy and SAS), visualization, and user reporting.
- Developed complex integrations of external sources such as the Google API, Salesforce API, and Environics to land data on the Hadoop platform using data ingestion tools such as Sqoop, NiFi, and Informatica BDM.
- Implementing a 360-degree customer profile data mart with data ingested from 20+ internal and external sources, including AWS, the Google API, the Environics API, and the Salesforce API.
- Designed and implemented big data ingestion pipelines to ingest multi-TB data from various data sources using Kafka and Spark Streaming, including data quality checks and transformations, with output stored in efficient storage formats.
- Performed data wrangling on multi-terabyte datasets from various data sources for a variety of downstream purposes, such as analytics, using PySpark.
- Designed and implemented Big Data analytics platform for handling data ingestion, compression, transformation and analysis of 30+ Internal and external sources.
- Designed a highly efficient data model for optimizing large-scale queries using Hive complex data types and the Parquet file format.
Technologies Used: Scala, Hive, Druid, Unix shell, Apache Spark 2, Spark Streaming, NiFi, Kafka, Hortonworks 2.2, Docker, Atlas, Apache Ranger, Spring, Spring Boot, Jira, Confluence, Google Cloud, GitHub, SourceTree, Eclipse IDE, IntelliJ IDEA, Maven, SBT, MicroStrategy, SAS, Informatica BDM
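The data quality checks and upsert-style deduplication described in the ingestion bullets above can be sketched as follows. This is a minimal, illustrative plain-Python sketch only; the real pipelines ran on Spark, and the field names and rules here are hypothetical.

```python
# Illustrative sketch of record-level data quality checks plus
# keep-latest deduplication; field names and rules are hypothetical,
# and the actual pipelines used Spark rather than plain Python.

def validate_record(rec):
    """Return True if a raw event record passes basic quality rules."""
    required = ("customer_id", "event_ts", "channel")
    if any(rec.get(f) in (None, "") for f in required):
        return False
    # Reject obviously bad timestamps (positive epoch seconds expected).
    return isinstance(rec.get("event_ts"), int) and rec["event_ts"] > 0

def dedupe(records, key="customer_id"):
    """Keep the latest valid record per key, mimicking an upsert merge."""
    latest = {}
    for rec in records:
        if validate_record(rec):
            k = rec[key]
            if k not in latest or rec["event_ts"] > latest[k]["event_ts"]:
                latest[k] = rec
    return list(latest.values())

raw = [
    {"customer_id": "c1", "event_ts": 100, "channel": "web"},
    {"customer_id": "c1", "event_ts": 200, "channel": "store"},
    {"customer_id": "c2", "event_ts": 0,   "channel": "web"},   # bad timestamp
    {"customer_id": "",   "event_ts": 300, "channel": "web"},   # missing key
]
clean = dedupe(raw)  # only the latest valid c1 record survives
```

In a Spark pipeline the same logic would typically be expressed as a filter followed by a window-ranked dedup over the key column.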
Senior Data Analyst - Digital Integrations
- Performed business analysis; gathered and analyzed requirements; and developed detailed Solution Design Documents (SDS), Business Requirements Documents (BRD), System Requirements Specifications (SRS), report requirements, interface requirements, and the RTM for the Confidential Digital Integration projects.
- Collaborated with business teams on the implementation of Confidential enterprise integration projects involving digital channels such as Confidential .com and Fido.ca with different backend BSS systems (Maestro, Super Systems, CRM, Oracle EBS, Amdocs billing system, etc.).
- Prepared Digital Integration solutions for Confidential 's new programs such as Hup PayatTill, Enroute, Fido Usage Enhancement, Invoice Evolution, Confidential Home Phone 1.5, WCoC Postpaid and Roaming, Next Best Action and offers, and EPON.
- Gathered business and functional requirements for new SOAP- and REST-based web services developed on the Oracle SOA and ESB platforms, using Enterprise Architect, XMLSpy, SoapUI, and the Eclipse IDE.
- Implemented system enhancements and fixes; performed system analysis and design of various web interfaces for the customer-facing web applications Confidential .com and Fido.ca.
- Developed data integration strategies for data flow between disparate source systems and the Big Data enabled Enterprise Data Lake.
- Performed data profiling, modeling, and metadata management tasks on complex data integration scenarios, adhering to enterprise data governance and data integration standards using Apache Atlas.
- Designed solutions for high-volume data stream ingestion, processing, and low-latency data provisioning using Hadoop ecosystem tools: Hive, Pig, Sqoop, Kafka, Python, Spark, Scala, NoSQL, NiFi, and Druid.
- Prepared Interface Design documents mapping fields from source to target systems.
- Effectively leveraged continuous integration, continuous delivery, and continuous deployment (CI/CD) Agile and DevOps tools and processes to deliver and support advanced Data Science and Big Data solutions and services, including Git, Jira, Jenkins, and others as required.
- Designed and developed a SOAP web services client using AXIS to send service requests to web services; invoked web services from the application to get data from multiple sources.
Technology used: file-based integration, JMS queues, web services (SOAP, HTTP GET & POST binding / REST) & SOAP API, asynchronous/synchronous and fire-and-forget integration patterns, cloud app integrations, SOA, Enterprise Service Bus (ESB), Jira, Confluence
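The SOAP request pattern described above (the original client was AXIS-based in Java) can be sketched in Python with the standard library. The service name, namespace, and fields below are hypothetical, purely to illustrate envelope construction.

```python
# Illustrative sketch of building a SOAP 1.1 request envelope, similar
# in shape to the requests the AXIS-based client sent. The target
# namespace, operation, and field names are hypothetical.
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
SVC_NS = "http://example.com/customer"  # hypothetical target namespace

def build_get_customer_request(customer_id):
    """Return a serialized SOAP envelope for a GetCustomer call."""
    ET.register_namespace("soap", SOAP_NS)
    env = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(env, f"{{{SOAP_NS}}}Body")
    req = ET.SubElement(body, f"{{{SVC_NS}}}GetCustomer")
    ET.SubElement(req, f"{{{SVC_NS}}}CustomerId").text = customer_id
    return ET.tostring(env, encoding="unicode")

xml = build_get_customer_request("12345")
```

In practice the envelope would be POSTed to the service endpoint with a `SOAPAction` header; tools like SoapUI (listed above) generate and validate these payloads against the WSDL.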
- As the SME, led the launch of warranty and supply chain applications (such as Outbound Logistics Controls, One Warranty Solution, Automatic Retrieval and Storage System, and Smart Autocall) for the US and Canada markets.
- Coordinated effectively at all levels with business sponsors, business groups, and technology teams to provide critical direction and ensure smooth development, testing, and implementation of data integration platforms.
- Analyzed business requirements and data; designed, developed, and implemented highly effective, highly scalable ETL processes for fast, scalable data warehouses.
- Researched and evaluated technical solutions, including various Hadoop distributions, NoSQL databases, and data integration and analytical tools, with a focus on enterprise deployment capabilities such as security, scalability, reliability, and maintainability.
- Advised and supported project teams (project managers, architects, business analysts, and developers) on tools, technology, and methodology related to the design and development of Big Data solutions.
- Managed and developed new data processes to enhance existing systems and support new requirements.
- Programmed and implemented logical and physical modeling practices on various data platforms to integrate cleanly into existing enterprise data models, executing data model components to achieve efficient storage utilization and the best query performance.
- Gathered user interactions and requirements from internal customers and consulted on best practices for using Big Data platforms effectively as a data and computing resource.
- Maintained knowledge of market trends and developments in analytics software and related emerging technologies, such as cloud hosting services and Agile/DevOps development processes, to provide, recommend, and deliver best-practice solutions.
Technology used: Primavera, Clarity, SQL Server, Oracle 11g, HTML, XML/XSD, XSL/CSS, Web Services, SOAP, TCP/IP, Microsoft BizTalk, Application Framework, Oracle EBS integration, Oracle Data Integrator, Oracle MFT
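The extract-transform-load pattern behind the ETL bullet above can be sketched minimally as follows. This is an illustrative sketch only: the feed layout, column names, and conversion rate are hypothetical, and the real processes ran on enterprise ETL tooling rather than plain Python.

```python
# Minimal, illustrative ETL sketch: extract rows from a delimited feed,
# transform (type and currency normalization), and load into a staging
# structure keyed by primary key. All names and the rate are hypothetical.
import csv
import io

FEED = """order_id,amount,currency
1001,250.00,CAD
1002,99.50,USD
"""

USD_TO_CAD = 1.35  # hypothetical fixed rate, for the example only

def extract(text):
    """Parse the delimited feed into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Cast types and normalize all amounts to CAD."""
    out = []
    for r in rows:
        amt = float(r["amount"])
        if r["currency"] == "USD":
            amt *= USD_TO_CAD
        out.append({"order_id": int(r["order_id"]),
                    "amount_cad": round(amt, 2)})
    return out

def load(rows, warehouse):
    """Upsert transformed rows into the target structure by key."""
    for r in rows:
        warehouse[r["order_id"]] = r
    return warehouse

warehouse = load(transform(extract(FEED)), {})
```

The same three stages map directly onto the staging, conforming, and loading layers of a warehouse ETL job.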
Project Lead/ IT Senior System Analyst
- Led the end-to-end implementation of production-critical applications for Ford manufacturing and supply chain operations.
- Functioned as Sr. Business Analyst for the projects involving integration of ERP with manufacturing applications.
- Coordinated effectively at all levels with business sponsors, business groups, and technology teams to provide critical direction and ensure smooth development, testing, and implementation of complex applications.
- Organized, participated in, and governed requirements sessions involving stakeholders from sponsors, business, operations groups, and technology teams; provided guidance to detail out scoped requirements; and influenced and negotiated effectively to reach harmonized conclusions on scope creep.
- Ensured data requirements (data mapping, master data, transactional data, metadata, and incremental data cuts) were well understood at all levels of the business, operations, and technology teams to support robust data migration needs.
- Organized UAT and user group training activities; detailed business flows and limitations of the PFS system for users; managed releases and functional pre-production testing of applications; tracked and managed issues until closure; and provided user group support for UAT.
- Prepared project proposals, including scope analysis, future state, cost estimates, terms of reference, and cost/benefit studies for new applications.
Technology used: TOAD, SQL*Plus, SQL*Loader, SoapUI, Notepad++, Beyond Compare, XMLSpy, MS Visio, MS Project, Oracle EBS
IT Data Analyst - Oracle EBS
- Involved in the Business Transformation Program: migration of region-specific business processes into an Oracle eBiz R11 ERP instance with optimized standard processes for the India manufacturing facility.
- Conducted in-depth workshops to elicit, review, and confirm business requirements, business process reengineering, data and workflows, business models, business rules, and user interface design.
- Liaised with business owners to understand business requirements and prepared high-level process flow diagrams using MS Visio.
- Prepared the Functional Specification and Design document including all the components that impact the ERP system
- Managed the development of custom application enhancements and business user testing/sign-off.
- Used project status tools such as Clarity and Primavera, and the SDM methodology, to ensure quality solution delivery.
- Served as Service Delivery lead for Level 1/2 business systems support for the Asia-Pacific and EMEA ERP applications.
- Managed projects involving virtual teams distributed across geographies, covering project initiation, execution, and closure activities, SDLC knowledge, effort estimation methodologies, and quality processes.
Technology used: Oracle EBS (SCM), Oracle Financials, TOAD, I2 MRP, SQL plus, SQL Loader, MS Visio, MS Project, Clearcase, Kintana, BMC Remedy