Evolving Granular Modeling from Uncertain Data Streams
Daniel Leite, Federal University of Lavras – UFLA, Brazil

Plenary talk at the IEEE Conference on Evolving and Adaptive Intelligent Systems EAIS’16 – Natal, May 2016

Summary Part I – Granular Computing Part II – Evolving Granular Systems Part III – Application Examples

Part I – Granular Computing

What is Granular Computing? Dictionary • Granular: composed of granules • Granule: a small amount of something • Granularity: the extent to which a system is subdivided into smaller parts

Granular Computing (GrC) • Problem solving based on different levels of granularity (detail/abstraction) • Different levels of granularity are essential for humans to solve problems

Example: boiling an egg
Rule: drop the egg in boiling water for 5 minutes

Problem: sometimes the egg is too soft, sometimes too hard

Suggestion [figure: two IF–THEN rules, each with a '5 min' consequent]

Solution: higher granularity of the attribute 'size'

Example: packing a bag

Solution: lower granularity of the attribute 'temperature'

Granularity and abstraction “We look at the world under various grain sizes and abstract from it only those things that serve our present interest”. Hobbs, 1985. “Abstraction allows people to consider what is relevant and to forget irrelevant details which would get in the way of what they are trying to do”. Giunchiglia, 1992.

Reflection We • perceive and represent the world at different levels of granularity • understand problems and solutions at different levels of abstraction • adopt appropriate levels of granularity and change granularity easily

GrC inspiration • Capture basic principles used by humans to solve problems • Hierarchical notion - Low level → detailed/precise concepts - High level → abstract/imprecise concepts

Necessary and sufficient granularities

GrC • Theories and methods that make use of granules to solve problems • Subset of a universe → granule • GrC basic ingredients - Subsets - Classes - Clusters

Fundamental concepts • Granule: interval, fuzzy set, rough set, cluster, … • Granulation - Decomposition: coarse → fine (boiling egg) - Construction: fine → coarse (packing bag)

• Granular structure: a set of granules • Relationship, hierarchy

Multidisciplinarity • There is no unified framework for GrC • Different names in correlated areas: - Fuzzy and rough sets - Interval mathematics - Divide and conquer - Quotient space theory - Information fusion, ...

Research • GrC is a research area by itself: it has its own principles, theory and applications • Zadeh, 1997, 2005: “GrC is a superset of the theory of fuzzy information granulation, rough set theory and interval computations” “A generalized theory of uncertainty”

Historical notes • 1979, Zadeh discusses information granulation • 1997, T. Y. Lin suggests the term GrC. A group, BISCGrC, is formed • 2004, IEEE CIS Task Force on GrC is proposed • 2005, 1st IEEE International Conference on GrC • 2009, Journal of GrC, Rough Sets and Intelligent Systems • 2016, Springer Journal of GrC

GrC Publications • IEEExplore, ScienceDirect • Search (metadata only) - Granular computing - Granular fuzzy - Information granulation/granularity

Bargiela, A.; Pedrycz, W. Granular Computing: An Introduction. Kluwer Academic Pub. Boston, 2004

Pedrycz, W.; Gomide, F. Fuzzy Systems Engineering: Toward Human-Centric Computing. Wiley-Hoboken, 2007

Pedrycz, W.; Skowron, A.; Kreinovich, V. Handbook of Granular Computing. Wiley-Chichester, 2008

Part I: final remarks • GrC: a searchable term in bibliographic databases • GrC shares generalities of many domains • Research grows → long way ahead → requires interaction, dissemination

Part II – Evolving Granular Systems

Challenges in data analysis

• Data increase in volume, speed, and uncertainty • Analyze data as they occur • Make sense of uncertain data • Online learning in human-centered systems

Motivation State-of-the-art machine learning needs methods and algorithms to: • build models of complex systems • process time-varying data streams • recursively adapt parameters and structures • deal with different types of data

Evolving granular systems Features • Incremental learning from online data streams • Gradual structural development • Models: rule base, neural network, ... • A GrC aspect: representation, parameter, data, …

How do granular data arise?
1 - Measurements, e.g. from unreliable sensors: an interval around the reading → the true value is certainly included

2 - Expert judgement e.g. medical diagnosis, risk analysis

[Diagram: diagnosis built from patient interview (signs, symptoms, complaints), physical exam, medical tests, and medical history (patient's and family's, past and present)]

3 - Imprecision in pre-processing steps: missing data, outliers; data cleaning, integration, transformation

4 - Summaries of numeric data over time [figure: numeric samples 1, 1, 1, 1.5, 1.5 along the time axis summarized into a granule] → central tendency, dispersion, dependence
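For illustration only (not part of the talk), a minimal sketch of summarizing a numeric window into a granule described by central tendency, dispersion, and an enclosing interval; the window and the statistics chosen here are assumptions:

```python
import statistics

def summarize_window(window):
    """Summarize a window of numeric samples into a granule:
    central tendency (mean), dispersion (std. dev.), enclosing interval."""
    return {
        "mean": statistics.fmean(window),          # central tendency
        "std": statistics.pstdev(window),          # dispersion
        "interval": (min(window), max(window)),    # enclosing interval granule
    }

# the stream 1, 1, 1, 1.5, 1.5 summarized as one granule
print(summarize_window([1, 1, 1, 1.5, 1.5]))
# {'mean': 1.2, 'std': 0.2449..., 'interval': (1, 1.5)}
```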

Time and space granulation

Singular approximation

Granular approximation

Learning from granular data streams
Begin
  Do
    1: Read data sample
    2: Fit the data
      2.1: Create new granule
      2.2: Adapt granules and associated information
    3: Delete data sample
    4: Update the granular structure
End
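A minimal sketch of this learning loop in Python; the method names (find_fitting_granule, create_granule, adapt_granule, update_structure) are hypothetical placeholders for the mechanisms of a specific model such as IBeM, FBeM, or eGNN:

```python
def learn_from_stream(stream, model):
    """Single-pass learning over a (granular) data stream."""
    for sample in stream:                      # 1: read data sample
        granule = model.find_fitting_granule(sample)
        if granule is None:                    # 2.1: no granule fits the data
            model.create_granule(sample)
        else:                                  # 2.2: adapt granule and associated info
            model.adapt_granule(granule, sample)
        del sample                             # 3: delete data sample (bounded memory)
        model.update_structure()               # 4: merge/refine/delete granules
```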

1. Interval Based evolving Modeling IBeM features: • Processes interval data • Develops and adapts interval rules • Does not allow overlapping granules

Examples of interval data • Temperature between 20 and 27 °C • Speed above 40 km/h and below 60 km/h • Body mass index between 18 and 25 kg/m²

IBeM rule (general interval form): IF x1 is [l1, L1] AND ... AND xn is [ln, Ln] THEN y is [u, U]

IBeM learning algorithm:
• Creates and adapts granules and rules
• Refines existing granules
• Sets the model granularity
• Covers gaps
• Merges granules
• Deletes inactive rules
• Approximates functions

[Figures: granule contraction and expansion; covering a gap between neighboring granules]
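As an illustration of granule expansion, gap coverage, and merging on intervals, a minimal sketch under assumed notation (a single maximum-width parameter rho standing in for the model granularity); this is not the exact IBeM procedure:

```python
class IntervalGranule:
    """One-dimensional interval granule [l, L]."""
    def __init__(self, l, L):
        self.l, self.L = l, L

    def contains(self, x):
        return self.l <= x <= self.L

    def expand(self, x, rho):
        """Expand bounds to include x if the new width respects the
        maximum granularity rho; otherwise signal that a new granule is needed."""
        new_l, new_L = min(self.l, x), max(self.L, x)
        if new_L - new_l <= rho:
            self.l, self.L = new_l, new_L
            return True
        return False

def merge(g1, g2, rho):
    """Merge two neighboring granules (covers gaps) if the union respects rho."""
    l, L = min(g1.l, g2.l), max(g1.L, g2.L)
    return IntervalGranule(l, L) if L - l <= rho else None
```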

2. Evolving fuzzy modeling FBeM main features: • Learn from fuzzy data • Union of local fuzzy models • Accurate and linguistic outputs

*FBeM: Fuzzy set Based evolving Modeling

Examples of fuzzy data • Gasoline price around R$ 3.50 • Atmospheric pressure of approximately 101,325 Pa • Electrical current of 10 A, plus or minus 0.2 A

FBeM rule (general fuzzy form): IF x1 is A1 AND ... AND xn is An THEN y is B

Granulation of fuzzy data
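For illustration, a fuzzy granule can be represented by a trapezoidal membership function; the specific shape and the numbers below are assumptions, not necessarily the form used in FBeM:

```python
def trapezoidal(x, a, b, c, d):
    """Trapezoidal membership: 0 outside [a, d], 1 on the core [b, c],
    linearly increasing on (a, b) and decreasing on (c, d)."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

# Fuzzy datum "gasoline price around R$ 3.50":
# assumed core [3.45, 3.55] and support [3.30, 3.70]
print(trapezoidal(3.60, 3.30, 3.45, 3.55, 3.70))   # ~0.67
```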

3. Evolving neuro-fuzzy modeling eGNN features: • Learn from fuzzy data • Adaptation of granules, neurons, connections • Information fusion: aggregation neurons

*eGNN: evolving Granular Neural Network

Fuzzy aggregation neuron

Example: U_{min,max} is a uninorm with identity element e = 0.3, v = 0
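A minimal sketch of a uninorm aggregation operator, assuming the common min/max construction: t-norm-like (min) when both arguments are below the identity element e, s-norm-like (max) when both are above, and conjunctive (min) in the mixed region; this illustrates the idea rather than the exact eGNN neuron:

```python
def uninorm_min_max(x, y, e=0.3):
    """Uninorm with identity element e: AND-like (min) below e,
    OR-like (max) above e, min in the mixed region."""
    if x <= e and y <= e:
        return min(x, y)     # behaves as a t-norm
    if x >= e and y >= e:
        return max(x, y)     # behaves as an s-norm
    return min(x, y)         # mixed region (conjunctive choice)

# e acts as an identity element: uninorm_min_max(0.3, z) == z for any z in [0, 1]
print(uninorm_min_max(0.2, 0.25))   # 0.2 (both low -> AND-like)
print(uninorm_min_max(0.7, 0.9))    # 0.9 (both high -> OR-like)
```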

Structure eGNN


Part II: final remarks • IBeM, FBeM and eGNN • Deal with current challenges in data analysis • Granular data stream modeling

Part III – Application Examples

1. Semi-supervised classification Twin Gaussians rotation (concept drift)

• Data stream (x, C)[h], h = 1, 2, ...: stationary concept for 1 ≤ h ≤ 200; gradual rotation of 90° for 201 ≤ h ≤ 400

eGNN (before): right/wrong rate 189/11 (94.5%), 5 granules
eGNN (after): right/wrong rate 195/5 (97.5%), 5 granules

ROC analysis

New class (concept shift)

• Data stream (x, C)[h], h = 1, 2, ...: two stationary classes for 1 ≤ h ≤ 199; concept shift at h = 200; three stationary classes for 201 ≤ h ≤ 400

eGNN (before): right/wrong rate 189/11 (94.5%), 6 granules
eGNN (after): right/wrong rate 185/15 (92.5%), 8 granules

Semi-supervised classification Rotating Gaussians problem

New class problem

2. Time series prediction Weather prediction • monthly data from 5 meteorological stations • period: Jan 1871 – Dec 2010 • min, average, max temperature • inputs: x[h-11], ..., x[h] • output: y[h+1]
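As an illustration of the input/output arrangement (the 12-month window comes from the slide; everything else is assumed), the sliding-window pairs could be built as follows:

```python
def windowed_pairs(series, window=12):
    """Arrange a monthly series into (inputs, target) pairs:
    inputs = x[h-11], ..., x[h]; target = y[h+1]."""
    return [
        (series[h - window + 1: h + 1], series[h + 1])
        for h in range(window - 1, len(series) - 1)
    ]

# short dummy series just to show the shape of the pairs
pairs = windowed_pairs(list(range(15)))
print(pairs[0])   # first pair: inputs [0, 1, ..., 11], target 12
```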

FBeM prediction for Ottawa

Granular prediction

Comparing predictors

3. Function approximation Parkinson's telemonitoring • 5875 biomedical voice measurements • 42 individuals with early-stage Parkinson's disease • 16 inputs: shimmer/jitter attributes • 1 output: total UPDRS score

Attribute selection • Leave-one-variable-out method: progressively eliminates input variables
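A minimal sketch of the leave-one-variable-out idea, assuming a generic train_and_score(X, y) function that returns a validation error; at each pass the variable whose removal degrades the error the least is dropped:

```python
def leave_one_variable_out(X, y, train_and_score, n_keep=1):
    """Backward elimination: repeatedly drop the input variable whose
    removal degrades the validation error the least."""
    selected = list(range(len(X[0])))                 # indices of remaining inputs
    while len(selected) > n_keep:
        trials = []
        for v in selected:
            kept = [i for i in selected if i != v]
            X_kept = [[row[i] for i in kept] for row in X]
            trials.append((train_and_score(X_kept, y), v))
        _, cheapest = min(trials)                     # smallest error when removed
        selected.remove(cheapest)
    return selected
```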

FBeM

Performance comparison

4. Model-free control Autonomous navigation

IBeM

FBeM

eGNN

FBeM granular output

5. Model-based control Self-tuning PDC evolving granular regulator

Fuzzy granular control rule

• A fuzzy Lyapunov function is assumed • A theorem for closed-loop stability with bounded inputs is proved; checking it reduces to an LMI feasibility problem

Lorenz attractor

Stabilization of chaos

Part III: final remarks Evolving granular systems applied to: • Semi-supervised classification • Time series prediction • Function approximation • Control

Conclusion • Evolving granular models and controllers • Handle time-varying data streams efficiently • Accept granular and/or numerical data • Provide precise and linguistic outputs

Open issues 1 - New time and space granulation methods 2 - Similarity measures and aggregation operators for granular data 3 - Incremental feature selection and data imputation 4 - Convenience of granular parameters 5 - Optimal granularity 6 - Big data problems and complexity issues 7 - Evolving approximation theorems
