A concise review of customer history and other relevant information points
As the voice, and sometimes the face, of a company, Contact Centre Agents must deal with customer inquiries efficiently and professionally. Pressures on agents include throughput-based service-level goals as well as the need to deliver a top-quality service and sales experience to customers at every point of contact. Maintaining this quality and efficiency requires providing key customer history data to agents during calls, without requiring agents to engage in time-consuming searches across interfaces. Unfortunately, providing such customer and case histories to the agent is far from a trivial task. Customer history summarisation is made difficult not only by enterprise-wide information integration challenges, but also by the computationally demanding task of determining the most relevant information that can be presented to the agent in bite-sized chunks.
Encouraging the greatest level of change in customer or user behaviour
Providing analytics-driven insight does not guarantee that people will take note of it and change their decision-making behaviour. For example, did a customer identified as a churn risk respond to the recommended intervention, or did loan officers in a bank actually base their lending decisions on the output of risk models? Measuring the success of any analytics-driven project therefore requires looking beyond the insights that are produced to the ways in which those insights are communicated and delivered, to ensure that behavioural change takes place. This research will benefit any company that wishes to make sure that the application of analytics results produces the required or expected outcome. For example, did a customer identified by a supermarket as a user of baby products take advantage of a promotion on nappies on offer that week? If not, can a message be tailored for that customer to encourage them to take advantage of such a promotion the next time? Using tailored messages, our research works on encouraging the greatest level of change in a customer's or user's behaviour.
Customer segmentation for a broad user base
With the rise of service customisation, data analytics is becoming increasingly important for understanding the needs of a company's customers. For the potential of data analytics to be fully realised, analytics tools must be developed for the widest possible user base. While major existing analytics tools have enabled widespread adoption of data analytics, they remain mostly targeted at expert analytics users. At CeADAR we are developing smart analytics tools for non-analytics experts - the so-called next billion users. Under the CeADAR Intelligent Analytic Interfaces: Ease of Interaction theme, smart analytics tools are being developed to aid non-specialist users in exploring datasets and performing analytics tasks. The first task selected for focus under this theme is customer segmentation, an especially common data analytics task and one that organisations tend to perform when they first start to use data analytics. Presenting segmentation results to the user in an interpretable way is the key to the success of SmartSeg. To understand customer segmentations, analytics practitioners compare the distributions of features within a segment to the distributions of the same features in the underlying full population.
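The segment-versus-population comparison described above can be sketched in a few lines. This is an illustrative example only, not the SmartSeg implementation; the `segment_profile` function, the toy customer data and the feature names are all invented for the sketch, which compares per-feature means rather than full distributions.

```python
from statistics import mean

def segment_profile(population, segment_ids, features):
    """Compare each feature's mean inside a segment against the
    full-population mean, reporting the relative lift.

    population:  dict of id -> {feature: value}
    segment_ids: ids of the customers in one segment
    Returns {feature: (segment_mean, population_mean, relative_lift)}.
    """
    profile = {}
    for f in features:
        pop_vals = [row[f] for row in population.values()]
        seg_vals = [population[i][f] for i in segment_ids]
        pop_mean, seg_mean = mean(pop_vals), mean(seg_vals)
        profile[f] = (seg_mean, pop_mean, seg_mean / pop_mean - 1.0)
    return profile

# Toy data: a segment of high spenders ({2, 4}) within four customers.
customers = {
    1: {"spend": 100.0, "visits": 4},
    2: {"spend": 300.0, "visits": 10},
    3: {"spend": 120.0, "visits": 5},
    4: {"spend": 280.0, "visits": 9},
}
print(segment_profile(customers, [2, 4], ["spend", "visits"]))
```

A practitioner reading this output sees immediately that the segment spends 45% more than the population average, which is the kind of interpretable contrast the text describes.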
Observing Key Topics From Multi-Channel Audio Streams
There is a large quantity of information communicated through speech audio streams (phone calls, broadcasts, commentary, etc.) across many industries. In many instances this information is of potentially high value but is discarded for lack of efficient and accurate information retrieval strategies. Speech recognition technologies partially solve this problem, but the output texts are still high-volume, noisy and difficult to interpret and summarise.
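One common way to surface key topics from such transcripts is term weighting. As a minimal sketch (assuming per-channel ASR transcripts are already available; the function name and the toy transcripts are invented here, and a real system would add tokenisation, stop-word handling and phrase detection), terms can be ranked by TF-IDF so that words distinctive to one channel rise above words common to all:

```python
import math
from collections import Counter

def top_terms(transcripts, k=3):
    """Rank terms in each transcript by TF-IDF to surface key topics.

    transcripts: dict of channel -> transcript text (e.g. ASR output)
    Returns dict of channel -> the k highest-scoring terms.
    """
    docs = {ch: text.lower().split() for ch, text in transcripts.items()}
    n = len(docs)
    df = Counter()                      # document frequency per term
    for tokens in docs.values():
        df.update(set(tokens))
    result = {}
    for ch, tokens in docs.items():
        tf = Counter(tokens)            # term frequency in this channel
        scores = {t: tf[t] * math.log(n / df[t]) for t in tf}
        result[ch] = sorted(scores, key=scores.get, reverse=True)[:k]
    return result

calls = {
    "support": "refund refund delivery delay",
    "sales": "upgrade pricing pricing delivery",
}
print(top_terms(calls, k=2))
```

Terms that appear in every channel (here, "delivery") score zero, so each channel's list highlights what is distinctive about it.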
Continuous Data Stream Analytics for Image Data
Object detection, especially small-object detection, is a well-known and well-studied image analysis challenge. One example is finding predefined brands or logos in a large number of images, or in a continuous image stream, in real time. Current logo detection applications are generally run offline and cannot handle a large, continuous image data stream or process it in real time or near real time.
Identifying Persons of Interest in Social Networks
Finding persons of interest on social media is not a trivial task, and current approaches to people search are simplistic and based on collaborative filtering. There is a growing need to find people in a particular area of expertise, location and/or business. IdentityMatch provides near-real-time retrieval of results from user searches.
Personalised forecasting model for more accurate predictions
Different areas of the economy require precise forecasts of future events based on knowledge collected from the past. Technology platforms that are accurate and effective in acquiring and processing such information are of great value to the market because they assist in planning and help to prepare for what is coming. Unlike typical forecasting, which can be based on informal methods, optimal forecasting is conducted with scientific tools to assure proper outputs. As a result, appropriate feedback on a given problem can successfully warn if the process under investigation is heading in an undesired direction. This, in turn, can support managers with meaningful information that allows them to improve planning. An automated forecasting platform can produce trustworthy analytics, which leads to better decision-making processes. Successful forecasting can minimise the risk associated with the unknown future and, if properly utilised, may give rise to profits that would not otherwise be achievable.
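To make the contrast with informal methods concrete, here is one of the simplest scientific forecasting tools, simple exponential smoothing. This is a generic textbook method shown for illustration, not the platform's actual model; the function name and the sample series are invented for the sketch.

```python
def ses_forecast(series, alpha=0.5):
    """Simple exponential smoothing: the one-step-ahead forecast is a
    running average that gives weight `alpha` to each new observation,
    so recent values count more than older ones.

    series: list of past observations, oldest first
    alpha:  smoothing factor in (0, 1]
    """
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level  # forecast for the next period

# Example: a short demand history; the forecast tracks the recent rise.
print(ses_forecast([10, 12, 14], alpha=0.5))
```

Higher `alpha` makes the forecast react faster to change at the cost of more noise, which is exactly the kind of tunable, reproducible trade-off that distinguishes a scientific method from an informal guess.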
High-throughput, Scalable Data Stream Clustering
High-volume, high-throughput data streams are common in many industries, including financial services (transaction streams), communications (instant messaging, SMS, micro-blogging), web and gaming (action and event streams) and production-line environments (machine-generated data). The ability to analyse and gain insights from this type of data as events happen, in real time, can be hugely beneficial. Traditionally, data analytics is performed as an offline, batch process where the results are available hours or even days after the data was produced. This means that any actions taken based on these insights come a considerable time after the original events occurred; in many scenarios, being able to analyse the live data stream and so reduce this response delay is of critical importance. Clustering is a core data analytics technique whereby similar entities are automatically identified and grouped together. It drives many common applications of data analytics, such as detecting anomalous or fraudulent activity, identifying market segments and user behaviours, and flagging spam and emerging topics and patterns. CeADAR has developed a high-throughput, scalable clustering solution for data streams that brings real-time, 'live data' capabilities to these advanced data analytics tasks.
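What makes stream clustering different from batch clustering is that each point is processed once, on arrival, with no pass back over history. A minimal sketch of this idea is sequential (online) k-means, shown below; this is a generic illustration of the streaming principle, not CeADAR's solution, and the class name and toy points are invented for the example.

```python
import math

class StreamClusterer:
    """Sequential (online) k-means: each arriving point nudges its
    nearest centroid, so the model updates in O(k) per event with no
    batch pass over historical data."""

    def __init__(self, init_centroids):
        self.centroids = [list(c) for c in init_centroids]
        self.counts = [1] * len(init_centroids)

    def add(self, point):
        """Assign one streamed point and update its centroid in place."""
        i = min(range(len(self.centroids)),
                key=lambda j: math.dist(point, self.centroids[j]))
        self.counts[i] += 1
        eta = 1.0 / self.counts[i]  # shrinking step size per cluster
        self.centroids[i] = [c + eta * (p - c)
                             for c, p in zip(self.centroids[i], point)]
        return i  # cluster id, available immediately for live decisions

# Two seed centroids; events are clustered the instant they arrive.
sc = StreamClusterer([[0.0, 0.0], [10.0, 10.0]])
print(sc.add([1.0, 1.0]), sc.add([9.0, 9.0]))
```

Because `add` returns the cluster id immediately, a downstream rule (e.g. "alert if an event lands in the anomalous cluster") can fire with no batch delay, which is the real-time capability the paragraph describes.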
Advanced monitoring of entities in online media sources
In the always-connected, 24/7 online marketplace, up-to-date knowledge is paramount for good decision making. Across time zones around the globe, content is constantly emerging that can affect the reputation of a firm, individual or product. Depending on the trust level attached to the source, a single social media message can cause significant damage to the reputation of an entity, as experienced when a tweet sent from the hacked Associated Press Twitter account caused the Dow Jones Index on Wall Street to drop sharply. Online monitoring of the reputation of an entity enables quick reactions to current events or market fluctuations.
PSAT is a Matlab-based suite for power system steady-state and dynamic analysis
PSAT is a Matlab toolbox for electric power system analysis and control. The command-line version of PSAT is also GNU Octave compatible. PSAT includes power flow, continuation power flow, optimal power flow, small-signal stability analysis and time-domain simulation. All operations can be accessed by means of graphical user interfaces (GUIs), and a Simulink-based library provides a user-friendly tool for network design. The software is provided as open source; the purchase price covers instant delivery of the electronic documentation.