List of Analytics Companies in Chennai, Hyderabad, Vizag, Cochin, Puducherry

Accenture – Chennai


Management Consulting, outsourcing, advanced applications across various industry verticals – Analytics

Cognizant (GDC) – Chennai, Cochin

Big data analytics, customer analytics, risk, operation analytics – consulting

Datamatics – Chennai, Puducherry

Financial, healthcare insurance, retail, manufacturing

Global Analytics – Chennai

HCL Technologies – Chennai

Advisory/ Support – Business Analytics, BI, Big Data

Latent View – Chennai

Analytics consulting, Big Data

McKinsey Knowledge Center (McKC) – Chennai

Retail, Research & Analytics

Nabler – Hyderabad

Digital Analytics / eCommerce

PricewaterhouseCoopers (PwC) – Chennai

Big Data, Analytics

Tata Consultancy Services (TCS) Delivery Centre – Chennai, Cochin

Analytics services across various industries

Pre-built Analytics platform solutions & BI

WNS Analytics  – Chennai, Vizag, Hyderabad

Marketing, Consumer Behavior Analytics

URGENT – 2 Openings in Kolkata at Large Analytics Consulting Company – Interviews on 22 Aug, Saturday

Job Openings!

Two openings in a Large Analytics Consulting company in Kolkata. Exclusively for Ivy’s Analytics students. The details are given below:

  1. Location: Kolkata
  2. Openings (two separate openings): 
    1. Predictive Modeling – Knowledge of Linear, Logistic, Time Series (ARIMA), and other basic modeling techniques. Hands-on knowledge of SAS / R is a plus. Candidates with project experience would be given preference.
    2. Reporting – Hands-on knowledge of Advanced Excel, VBA and SQL.
  3. Interview Date: 9:00 AM to 2:00 PM on 22nd Aug 2015 (Saturday)
  4. Company and Venue: Unitech, Rajarhat. For company details, please call 9748441111 or email to
  5. Deadline to apply: 21st Aug 2015, 4:45 PM
  6. How to Apply: Please share the following details by 4:45 PM on 21st Aug 2015 to
      • Pan Card Number (Mandatory), Date of Birth (Mandatory), First Name, Surname, Gender, Mobile, Email
  7. Must Carry: Government ID card (Mandatory) and a few copies of your resume for the interview.

All the information requested above is important.

Career Center | Ivy Professional School


Common Statistical Models used in Predictive Analytics

In a previous blog, we covered the use of predictive modeling techniques to predict future outcomes. The techniques used differ across applications; however, some fundamental statistical techniques, mathematical algorithms and neural network systems are common to predictive modeling.


Model Building for Forecasting

Statistical techniques and tools in use

  • Linear regression
  • Logistic regression
  • Cluster analysis
  • Analysis of variation (ANOVA)
  • Chi-squared test
  • Correlation
  • Factor analysis
  • Association rules
  • Decision trees
  • Time series
  • Experimental design
  • Bayesian theory – Naïve Bayes classifier
  • Sampling
  • Matrix operations
  • K-nearest neighbor algorithm (k-NN)
  • Pearson’s r

Commonly used Statistical models

Logistic Regression:

Logistic regression models the relationship between a binary dependent (response) variable and one or more independent (explanatory) variables, and looks at how significant the relationship between the variables is. It estimates the probability (p) that event “1” occurs rather than event “0”. Where a good fit of the model is obtained, you can plug in the independent variable values for a new observation and predict whether the dependent value will be 0 or 1.


Banks – for building scorecards of customers applying for loans. The loan officer identifies characteristics that indicate the probability of loan default and uses these to build a scorecard of good and bad credit risks. Data on past, current and potential customers is used to fit a logistic regression model, which is then used to classify potential customers who have applied for a loan as good or bad credit risks. This uses binary logistic regression, as the dependent variable is dichotomous (loan default OR no default).

Educational institutions – An engineering college may estimate enrolments of fresh students to determine cut-off marks and finalise admissions. A multiple logistic regression model factors in Class 10, Class 12 and related AIEEE scores, distance from the college, demographic information including stream preferences, and historical data on student enrolments, to calculate the probability of enrolment. The estimated model has to fit the data adequately to be significant. Calculations can also be made to estimate how a single independent variable affects the likelihood of application.
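To make the scorecard idea concrete, here is a minimal sketch in plain Python: a binary logistic regression fitted by gradient descent on hypothetical loan data. The features, figures and decision threshold are illustrative assumptions, not real bank data.

```python
import math

# Hypothetical applicant data: [credit_utilisation, late_payments],
# label 1 = defaulted, 0 = repaid.
X = [[0.9, 4], [0.8, 3], [0.7, 5], [0.2, 0], [0.3, 1], [0.1, 0]]
y = [1, 1, 1, 0, 0, 0]

def sigmoid(z):
    # Maps any real z to a probability p in (0, 1).
    return 1.0 / (1.0 + math.exp(-z))

# Fit weights by plain stochastic gradient descent on the log-loss.
w, b, lr = [0.0, 0.0], 0.0, 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        err = p - yi  # gradient of the log-loss w.r.t. the linear term
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

def predict_default_probability(applicant):
    return sigmoid(sum(wj * xj for wj, xj in zip(w, applicant)) + b)

# Score a new applicant: high utilisation, two late payments.
p = predict_default_probability([0.85, 2])
label = "bad credit risk" if p >= 0.5 else "good credit risk"
```

In practice the same fit would come from a library routine (e.g. a logit model), but the scorecard logic is the same: estimate p(default), then classify against a cut-off.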

Time Series:

The time series forecasting model makes predictions of future values based on previously observed (historical) values. Its two main goals are identifying the phenomenon represented by the sequence of observations, and forecasting future values of the time series variable. The pattern in the observed time series data is identified, described and integrated with other data; the identified pattern is then extrapolated to predict future events.


Time Series predictive models are used to make forecasts where the temporal dimension is critical to the analysis. Typical application scenarios are demand prediction of a product during a particular month / period, estimation of inventory costs, forecast of train passengers for the next financial year, and so on.
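As a minimal illustration of extrapolating an identified pattern, here is a simple exponential smoothing forecast in plain Python. The monthly demand figures and the smoothing parameter alpha are hypothetical; real demand forecasting would typically use richer models such as ARIMA.

```python
# Hypothetical monthly demand for a product (units sold).
demand = [120, 132, 128, 141, 150, 147, 158, 165, 162, 170]

def exponential_smoothing_forecast(series, alpha=0.4):
    """One-step-ahead forecast via simple exponential smoothing:
    level_t = alpha * y_t + (1 - alpha) * level_{t-1}.
    The final level is the forecast for the next period."""
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

# Forecast next month's demand from the historical series.
next_month = exponential_smoothing_forecast(demand)
```

Higher alpha weights recent observations more heavily; a trending series like this one will be under-forecast slightly, which is why trend-aware models (Holt's method, ARIMA) are preferred when a trend is identified.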


Clustering:

Clusters in the data are used for predictive modeling by grouping ‘like’ objects under a probability distribution. A model is hypothesized for each cluster, and the best fit of the model to each cluster is found. Clusters in customer behaviour may be used for predictive modeling, i.e. behavioural clustering, to predict the behaviour or buying patterns of customers. Clusters in product segmentation may be used to predict which categories of products customers are likely to buy. Algorithms auto-segment the objects based on several variables to derive the cluster ‘DNA’, which is then leveraged for predictive insights.

Cluster models are used to predict demand for products (a customer ordering baby clothes is likely to order diapers), brand preferences, the efficacy of a drug amongst a certain age group in clinical trials, stock market trends, groups of car insurance policy holders with a higher average claim cost, and more.
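The auto-segmentation step can be sketched with a tiny k-means implementation in plain Python. The customer features, values and choice of k = 2 are illustrative assumptions.

```python
import random

# Hypothetical customer features: [monthly_visits, avg_basket_value].
customers = [[2, 15], [3, 18], [2, 20], [14, 95], [16, 110], [15, 105]]

def kmeans(points, k=2, iters=20, seed=0):
    random.seed(seed)
    centroids = random.sample(points, k)  # start from k random points
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        # Assign each point to its nearest centroid (squared Euclidean).
        clusters = [[] for _ in range(k)]
        for p in points:
            d = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[d.index(min(d))].append(p)
        # Recompute each centroid as the mean of its cluster.
        for i, cl in enumerate(clusters):
            if cl:
                centroids[i] = [sum(v) / len(cl) for v in zip(*cl)]
    return centroids, clusters

centroids, clusters = kmeans(customers)
```

On this toy data the low-frequency/low-spend and high-frequency/high-spend customers separate into two clusters whose centroids form the “cluster DNA” used for targeting.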

Decision Trees: This statistical technique is a tree-like predictive model of decisions and their possible consequences. Based on Boolean tests, specific facts lead to general conclusions / decision points represented by nodes. Rules trace a path from the root through the nodes until an action is derived. Problems are structured as a tree whose terminal nodes (leaves) represent a specific event, scenario or subjective probability.

decision tree


A basic decision tree model to predict how many people buy ice cream because they crave it, even if they don’t have extra money.
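The root-to-leaf tracing described above can be sketched as follows. The tree here is hand-built to mirror the ice-cream example; real implementations such as CART learn the splits from data rather than being written by hand.

```python
# A hand-built decision tree (hypothetical splits): the root node asks
# about craving, the next node about having extra money.
tree = {
    "question": "craves_ice_cream",
    "yes": {"decision": "buys"},  # craving wins even without spare cash
    "no": {
        "question": "has_extra_money",
        "yes": {"decision": "buys"},
        "no": {"decision": "does not buy"},
    },
}

def classify(node, facts):
    """Trace the path from the root node to a leaf decision,
    applying a Boolean test at each internal node."""
    while "decision" not in node:
        node = node["yes"] if facts[node["question"]] else node["no"]
    return node["decision"]

result = classify(tree, {"craves_ice_cream": False, "has_extra_money": True})
```

Each path from the root to a leaf is one rule, e.g. “no craving AND no extra money → does not buy”.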

Personalisation – the mantra of Digital Marketing and eCommerce

From being an obscure backstage performer to emerging as the hottest technology-in-use, personalization has come a long way. Deployed across multiple platforms, devices and on-site sensors, used by small digital marketers to ecommerce giants like Amazon, personalization has become the mantra of every business performing in the digital space.

While Google may have kick-started the process of the personalized algorithm with its tailored searches, Facebook gave it a face lift with the “completely personalized newspaper”.  Yet, it is digital commerce which has aggressively leveraged varying degrees of personalization for a choreographed staging of consumer activity, even before the consumer has scripted it! Yes, a user may have already bought furniture and home appliances for his new home, but is nevertheless tempted by ads and promo mails to buy an exhaust fan for the kitchen that was not included in his budget.

In the quest to engage a customer for a longer CLV (customer lifetime value), a satisfactory experience and higher conversions, user activity is closely monitored for a 360-degree view. A compelling need for high-performance branding and marketing thus fuels a constant endeavour to devise the best-fit personalization algorithms.

What are the common technologies in use?

Personalization connects technologies and algorithms to data stored in various repositories for a meaningful connection to the customer profile. Customer behaviour, browsing patterns, transaction history, etc., are used to create an experience tailored to the customer.

  • Cookies
  • Collaborative filtering
  • User profiling
  • Data analysis tools
  • A/B Testing
  • Neural network
  • Bayesian network
  • Rules based engines
  • Vendor driven technologies
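To illustrate one item from the list, collaborative filtering: below is a minimal user-based sketch using cosine similarity over a hypothetical ratings table. The user names, items and ratings are made up for illustration.

```python
import math

# Hypothetical user -> {item: rating} purchase/rating history.
ratings = {
    "alice": {"lamp": 5, "sofa": 4, "rug": 4},
    "bob":   {"lamp": 5, "sofa": 5, "fan": 4},
    "carol": {"rug": 2, "fan": 5},
}

def cosine(u, v):
    """Cosine similarity between two sparse rating vectors."""
    common = set(u) & set(v)
    if not common:
        return 0.0
    dot = sum(u[i] * v[i] for i in common)
    nu = math.sqrt(sum(x * x for x in u.values()))
    nv = math.sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv)

def recommend(user, k=1):
    """Suggest items that similar users rated but `user` has not seen,
    weighting each rating by how similar the other user is."""
    seen = set(ratings[user])
    scores = {}
    for other in ratings:
        if other == user:
            continue
        sim = cosine(ratings[user], ratings[other])
        for item, r in ratings[other].items():
            if item not in seen:
                scores[item] = scores.get(item, 0.0) + sim * r
    return sorted(scores, key=scores.get, reverse=True)[:k]

suggestion = recommend("alice")
```

Production recommenders combine this with the other techniques listed (profiling, A/B testing, rules engines) and work over far sparser data, but the “people like you also bought” logic is the same.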

Businesses like insurance and banking, which require a granular personalization algorithm to keep ahead of the competition, have the capacity to invest in sophisticated technologies. Nevertheless, they too are deploying another method for successful implementation – artificial intelligence – where data is mined for insight-driven delivery of personalized advice. The key to building an effective ROI while extracting the most value from the personalization algorithm is to build a ‘lean’ personalization approach.

How to make sure the same user comes back soon to make more transactions?

By adopting an intuitive approach to solve the problem in the following ways:

  • Using experimentation, identify models that need to be refined or ‘tweaked’.
  • Leverage guesswork within a tested / structured framework.
  • Expose a trial or experimental personalization to a small, isolated sample of members, to get a sense of its viability while limiting the cost of errors.
  • Confine feedback on employee-only releases to staff, to understand the ‘whys’ behind a product recommendation and to improve the brand mix or category cadence in a targeted or personalized sale.
  • Apply human experience and cognitive thinking, a time-tested part of the recommendation arsenal.
  • Use user testing (trial and error) to catch costly errors early.
  • Use artificial intelligence and expert systems.
Regardless of the approach used, the end goal is to keep improving the techniques and algorithm design. Improving the quality of data, creating a time-honoured “test and learn” cycle with a targeted segment, and incorporating the results into the system are the Holy Grail of personalization.

List of Analytics Companies in Bengaluru

Absolute Data

Data science, Decisions engineering


Management Consulting, outsourcing, advanced applications across various industry verticals – Analytics

Affine Analytics

Retail, eCommerce, Banking, Financial

Blue Star Infotech Business Intelligence and Analytics Pvt. Ltd (formerly Activecubes Solutions )

BI & Analytics

BRIDGEi2i Analytics Solutions

Marketing, Risk, Supply Chain, Sales


Retail, Customer, Mobile Social

Cognizant (Global Delivery Centre)

Big data analytics, customer analytics, risk, operation analytics – consulting


Clinical & Pharma Analytics


Financial, healthcare insurance, retail, manufacturing

Deloitte Analytics

Risk, Fraud, HR, eCommerce


Customer, eCommerce

Fintellix (formerly iCreate)


Fractal Analytics

Retail, Finance, Big Data


Analytics, Research – outsourcing, consulting

HCL Technologies

Advisory/ Support – Business Analytics, BI, Big Data


Healthcare, Marketing, Financial, FMCG, Travel, Retail, Auto

Market Intelligence

BI & Digital Analytics – CPG, Finance, telecom


Financial, Lending, Collections, Recovery, Banking


Decision Science and Analytics


Digital Analytics / eCommerce

PricewaterhouseCoopers (PwC)

Big Data, Analytics


WNS Analytics

Marketing, Consumer Behavior Analytics


URGENT – Opening in Kolkata as Pricing Analyst

  1. Location: Kolkata
  2. Role: Pricing Analyst
  3. Shift Timing: 12:00 PM to 9:00PM IST
  4. Job Responsibility:
    1. Optimal Price Point Analysis for New/ Modified repairs and Price bundling.
    2. Value based Pricing as per Engine Life cycle stages
    3. Bi-annual Analysis of Sales data trends, Competitor assessments and Customer Segmentation
    4. Margin optimization and Price Catalog Escalation Strategy analysis
  5. Skills Required:
    1. Hands-on experience in price optimization and familiarity with product life-cycle analysis.
    2. Working knowledge of MS Excel and MS PowerPoint.
    3. Good written and oral communication skills are a must.
    4. Should have a quantitative background (Statistics / Economics / MBA-Marketing); strong mathematical aptitude is desired.
    5. Working knowledge of VBA and Excel macros.
    6. Familiarity with data management and model optimization would be an added advantage.
  6. Please send your resume to before 5 PM on 26 July 2015



Cloud Analytics and BI – the latest trend in enterprise analytics

With most businesses moving part of their operations to the cloud, analytics and BI in the cloud have emerged as the new frontiers of the service model. From IBM, Oracle, Hewlett-Packard and Salesforce to start-ups like Ideal Analytics, cloud analytics is the latest trend in the delivery of enterprise analytics.

Today, apps and data are increasingly being deployed to the cloud. This has led to cloud analytics and business intelligence revolutionising data analysis and reporting in a totally new avatar.

A short time-to-delivery for analytics, improved agility and an effective ROI are the key drivers for the phenomenal increase in the adoption of cloud analytics by large enterprises and smaller players.

The success of cloud BI and analytics has also given rise to many data visualisation tools and cloud suites for BI, data warehousing and analytics – for instance, Amazon Web Services, Alteryx, IBM Watson Analytics, the Booz Allen cloud analytics architecture and the SAP Big Data Analytics platform, all of which offer varying degrees of data storage, analysis and reporting.

Analytics in the Cloud

Analytics qualifies as cloud analytics where any of the following elements is fulfilled:

  • data stored or sourced in the cloud
  • cloud-based data modelling
  • cloud processing
  • cloud computing
  • analytic modeling, sharing or storage in the cloud

How does it work?

Cloud analytics services are either subscription-based or pay-per-use. Services and applications include data warehousing and harvesting, SaaS (software-as-a-service) or hybrid services, BI on demand and analytics-as-a-service. Hosted services are made available to the enterprise or to multiple users by the service provider. Off-site storage and harvesting of data cuts costs and improves access, so data can be retrieved and analysed when and where required. The cloud can also be deployed as a platform-as-a-service for data analytics.

Advantages:

  • Multiple pricing options; minimal hardware and infrastructure; minimum outlay
  • Speedy, cost-effective reporting and analysis environment; analytics on the go, in real time
  • Sophisticated dashboards, interactive visualization and intuitive reporting for non-technical users
  • In-memory analysis and integration of data from multiple sources
  • Mobile BI, multi-tenancy support, audit logging and OLAP functionality
  • Free technical support and multiple deployment options, including distributed cloud services
  • Various analytical models; reduced administrative and implementation costs
  • Ability to connect large, disparate social data; Big Data capability; and more

The Cloud Analytics market


Reports indicate that the global cloud analytics market is expected to grow from $5.25 billion in 2013 to $16.5 billion by 2018, at a CAGR of 25.8%.

The Banking, Financial Services and Insurance (BFSI) industry is expected to account for the largest market share.

The high growth markets are identified as:

Hosted warehouse – storage and management of mission critical data

Analytics – predictive, spatial, video, social media, text, web

Cloud BI – data integration, reporting, CRM

Application areas of BI and Analytics in the cloud

Healthcare, BFSI, media, social media, government, consulting, research and education, energy, telecommunication

Bottomline: As the data being handled becomes more voluminous, businesses are seeing value in moving their data analysis to the cloud. Rather than building large, complex data warehouses and deploying analytics on them, companies choose to connect with source data APIs and the cloud for rapid analysis in real time. Scalability, reduced infrastructure costs, improved ROI and analytics on the go are revolutionizing the way analytics is leveraged in the cloud.

Information, Career Advice & Job Alerts on Analytics and Actuarial Science Careers