A Management System for Distributed Knowledge and Content Objects

Wernher Behrendt, Nitin Arora, Rupert Westenthaler
Salzburg Research, Austria
{.}@salzburgresearch.at

Abstract

We present the results of a European research project which developed specifications for so-called Knowledge Content Objects (KCO) and for an attendant infrastructure, the Knowledge Content Carrier Architecture (KCCA). The work addresses the problem that, while there are many standards for content and for meta data, there is at present no suitable framework that enables organizations to manage knowledge alongside content in a coherent manner. Our approach postulates the KCO as a common structural entity which can be recognized and manipulated by any KCCA-enabled system.

1. Many standards, little coherence and a semantic mediation architecture

The digital world has developed many overlapping standards. The overlap arises partly from the need of any sector-specific standard to model part of the medium (e.g. certain properties of a video file, such as the encoding format) as well as part of the situational context in which the information objects are used (e.g. the notion of scenes in films or the notion of teaching units in computer assisted learning). This introduces an ontological bias into such standards, which then causes heterogeneous representations of potentially very similar content items. A growing problem is that content management systems in different sectors lock their users into de facto proprietary systems, because it proves impossible to transfer added knowledge (meta data) from one type of system to another. An example would be the transformation of MPEG-7 meta data (see [1] for an introduction) into the Learning Object Model (e.g. when we want to use some A/V material in an e-learning environment). Similar problems arise with any cross-sector content (e.g. moving an on-line news item about a new drug into a medical data management system).

Knowledge Content Objects (KCOs) define a very generic yet meaningful structure which is based on a so-called foundational ontology. KCOs are managed in the Knowledge Content Carrier Architecture (KCCA), which defines a minimal, distributed management infrastructure. Each KCCA "node" can take part in a federation of such nodes, communicating via a simple protocol, the Knowledge Content Transfer Protocol (KCTP), which is based on the FIPA Agent Communication Language [2]. This paper gives an overview of KCOs and the KCCA, and adds usage scenarios to illustrate the approach.

2. Related Work

In 1987, Tsichritzis et al. proposed the notion of a knowledge object (kno), which they envisaged, in the context of office automation, as a unit of exchange in collaborative environments [3] and which was designed as a hybrid object containing active as well as passive components. The KCOs proposed by us are similarly motivated (as units of information exchange), but we honour Dijkstra's strict distinction between data structure (KCO, passive) and process (KCCA nodes, active). However, KCOs are designed to carry state information with them, so that KCCA nodes can propagate information related to a specific KCO to each other. This aspect is indeed similar to the blackboard approach proposed by Tsichritzis et al. Two distinct extensions in KCOs are the facets, which structure the information space semantically, and the use of a foundational ontology to derive all semantics from a single ontological basis, including the KCO itself. In the mid 1990s, a number of architectures appeared which used an ontology-based approach to mediation between heterogeneous information systems. The WWW/HTML page quickly became the new "unit of exchange", and the object-based content models faded into obscurity, as did knowledge based systems as a whole.

Projects in the US and in Europe were highly influenced by Wiederhold's original 1992 paper on mediators and facilitators [4]. Towards the late 1990s, traditional web pages were challenged by sophisticated multimedia, virtual reality and time-based hypermedia. This led to developments such as SMIL (Synchronised Multimedia Integration Language) and ZYX, a multimedia document model that allowed the description of complex events in a web based presentation system [5]. Also towards the end of the 1990s, it became clear to the research communities that sooner or later, all types of information systems and representation formalisms would have to prove their value against the WWW, and in a large variety of knowledge- and media-intensive applications. This led to initiatives that brought the knowledge based systems communities back into the frame (through the Semantic Web). The communities also rediscovered the problem of information integration from heterogeneous data sources. Many researchers realized that standardisation along sectorial boundaries was not going to be sufficient in a global information exchange system. A good example of this is the work done by Lagoze and Hunter [6], combining multimedia and cultural heritage description standards. Work initiated by Guarino et al. [7] contributed to the realisation that interoperation fundamentally needs some shared denotational semantics against which applications can interpret statements in different representation formalisms. This led to renewed research into foundational or "upper" ontologies [8]. The contribution of knowledge content objects to all these developments is an engineering framework that allows us to place different kinds of system information into predefined semantic frames which are general enough to hold for any knowledge and any content, but which already offer, at this level, distinct operational semantics that can be supported by third party systems. In order for the objects to be interoperable in the Semantic Web, the KCO model is defined in OWL and uses the DOLCE foundational ontology for its modeling primitives. An overview of the ontological basis is given in [9].

3. Knowledge Content Objects

The KCO is divided into six "facets" which can be regarded as semantically distinct "compartments" that can be filled with data and meta data.

• Content Description facet - consists of three layers of description (facet elements): the meta data element describes how to access the content and gives information about the format, encoding and storage location of the source content. The content classification element allows the use of existing cataloguing and indexing standards. The propositional content element provides semantic interpretations of the subject matter of the content object, but not of the content object itself.
• Presentation facet - specifies how to present the content (e.g. as rendering information or by describing modes of interaction). The SMIL [10] standard is an example of information related to this facet.
• Community facet - specifies how the content is typically used. The description includes tasks and the roles responsible for the execution of these tasks in the context of a defined community. Content and knowledge may be used differently by different communities, so multiple descriptions are possible.
• Business facet - specifies how to trade the content (by declaring a negotiation protocol, license conditions, price, etc.). Business processes define special tasks and roles related to some business activity. This facet can be viewed as a specialization of the community facet.
• Trust and Security facet - specifies how to protect the content. In the KCO model, this facet was not developed in great detail. Its intended usage is to enable trust metrics and to activate security features where necessary.
• Self Description facet - declares the structure of the KCO itself, including active facets, ontologies used, and other KCO system related information.
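To make the facet structure more tangible, the following sketch represents a KCO as plain Python objects. This is only an illustration: the actual KCO model is defined in OWL on top of the DOLCE foundational ontology, and the attribute names used here (e.g. storage_location, propositional_content) are our own simplifications rather than the project's published vocabulary.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ContentDescription:
    """Three facet elements: medium meta data, classification, propositional content."""
    meta_data: Dict[str, str] = field(default_factory=dict)          # format, encoding, storage location
    classification: Dict[str, str] = field(default_factory=dict)     # e.g. IPTC or LOM category codes
    propositional_content: List[str] = field(default_factory=list)   # interpretations of what the content is about

@dataclass
class KCO:
    """Illustrative container for the six KCO facets (names are simplifications)."""
    content_description: ContentDescription
    presentation: Dict[str, str] = field(default_factory=dict)        # e.g. a reference to a SMIL document
    community: Dict[str, object] = field(default_factory=dict)        # tasks, roles, usage history
    business: Dict[str, object] = field(default_factory=dict)         # negotiation protocol, licence, price
    trust_and_security: Dict[str, object] = field(default_factory=dict)
    self_description: Dict[str, str] = field(default_factory=dict)    # KCO schema version, ontologies used

# A minimal KCO for a video clip:
clip = KCO(
    content_description=ContentDescription(
        meta_data={"format": "video/mpeg", "storage_location": "http://example.org/clips/42"},
        classification={"scheme": "IPTC4XMP", "subject": "sport"},
        propositional_content=["boys playing football near Anfield Road"],
    ),
    presentation={"smil": "http://example.org/clips/42.smil"},
)
print(clip.content_description.propositional_content)
```

In the prototype such a structure would, of course, be populated from the OWL/RDF representation rather than constructed by hand; the sketch only shows how the six compartments relate to each other.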

3.1. Understanding Content Classification vs. Proposition

The content description facet has three elements: the meta data element describes the properties of the medium rather than the themes or subject of the content. Content meta data can be encoding formats, colour schemes, ownership information, etc. The content classification element allows the reuse of cataloguing standards or sector-specific classification schemes, e.g. the "IPTC Core Schema for XMP" (see http://www.iptc.org/IPTC4XMP/ ). The propositional description element contains any descriptions that relate to what the content is about. It must be noted that the "aboutness" of content can be extremely arbitrary. For example, a Renaissance painting depicting Adam and Eve will show two naked persons, but it is quite likely that the Renaissance painting will be acceptable for use in an educational context, whereas a photograph of a nude couple in a contemporary city park may be regarded as offensive or unsuitable for such use.

At face value, even good object recognition software will classify the two images as very similar, and will be unable to recognize the subtle differences that lead human interpretation to distinguish between the two items of content. We would accept any description as propositional content, because we treat such descriptions not as semantic invariants, but as a human construction of reality, i.e. an interpretation. Therefore, a video showing some boys playing football in a garden can be interpreted as just "boys playing football near Anfield Road", but the scene could also be described as "a typical example of the reinforcement of petty-bourgeois, suburban behaviour patterns". It could equally be interpreted as "the young football star in his early days, when he learned how to play the game".

3.2. Understanding Tasks vs. Business

The so-called community facet specifies types of actors who can, in principle, make use of the content of a KCO. Associated with the actors are tasks which can be carried out by the actors. The tasks are parameterized by the type of content that is input and output of the tasks. For example, suppose we have a KCO containing a multi-track music recording and some dancing scenes described by a choreographer. The task description may now specify that there exists a drummer (actor) and that track one (a content parameter) can be used for a basic beat whereas track two (another content parameter) is used for percussion. Tracks four, five and six are for bass, rhythm and lead guitar, respectively, whereas tracks seven and eight are used for the vocals. The task could be to produce a number of cover versions of well known "Rock Classics", where each of the songs has its own detailed "task model" which is adequately described by the sheet music. Additionally, we would have, as further tasks, the choreography of the dancing scenes for each of the songs. Here, the task could be specified in terms of the dancing moves as well as the camera movements in space, whereas the content would refer to the dancing scenes filmed by the cameras. The business description might state the following:
1. "You can use the bass and rhythm tracks as backing for your private use (e.g. for Karaoke singing), but it is necessary to purchase a permission for each public Karaoke event. This holds for each song that is used once or more at that event."
2. "You may take one copy of the KCO for use on a different player."
3. "You may take part in a young talent competition by sending in the backing material with your own singing, and you grant permission to the vendor of the music KCO to use your singing for advertising purposes on their web site. In return you will be allowed to download 10 more KCOs as you wish."

The correct usage of the KCO could be monitored through the usage history sub-facet, which is part of the community facet. In this sub-facet, each use of the KCO by a player is recorded (or rather, can be recorded if this is part of the defined business model). One of the challenges that remain is the development of a sub-language for the definition of such business models and rules. The current static modeling cannot express the full richness of business models which rely on a flexible "give and take" between consumers and producers, leading to the notion of "prosumers".
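As a rough illustration of how the community and business facets of the music KCO above could be populated, the sketch below encodes a task description, some business rules and a usage-history entry as plain Python dictionaries. The key names and the rule wording are hypothetical; as noted above, a proper sub-language for such business models remains an open issue.

```python
# Hypothetical encoding of community-facet tasks and business-facet rules
# for the multi-track music KCO described above. Key names are illustrative only.
community_facet = {
    "tasks": [
        {
            "name": "produce cover version",
            "actor_role": "drummer",
            "input_content": ["track-1 (basic beat)", "track-2 (percussion)"],
            "output_content": ["drum take"],
        },
        {
            "name": "choreograph dancing scene",
            "actor_role": "choreographer",
            "input_content": ["sheet music", "camera movement plan"],
            "output_content": ["dance scene recording"],
        },
    ],
    "usage_history": [],   # recording is optional and depends on the business model
}

business_facet = {
    "rules": [
        "private Karaoke use of bass and rhythm tracks is free",
        "public Karaoke events require a permission per event and per song used",
        "one copy may be taken for use on a different player",
    ],
    "negotiation_protocol": "offer/counter-offer",   # placeholder value
}

def record_use(kco_community, player, variant, place):
    """Append a usage-history entry (only meaningful if history recording is enabled)."""
    kco_community["usage_history"].append(
        {"player": player, "variant": variant, "place": place}
    )

record_use(community_facet, player="amateur band", variant="karaoke backing", place="local club")
print(len(community_facet["usage_history"]), "usage record(s)")
```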

3.3. Understanding Business vs. Security

The previous example shows how the meta data and knowledge level can be used in a creative manner: the system makes it possible to define a novel business model which offers both sides flexibility and the possibility to engage in negotiation about mutually acceptable terms of use. What is important to note is the change of philosophy: we are interested in defining a business model and its semantics, rather than putting big locks on our content. This is due to a separation of concerns: the trust and security facet of the KCO is the place where one can put the "big locks" if needed. The trust aspect of this facet makes it possible for the KCO to carry metrics of usage around: how often has this KCO been copied but not paid for? How well has the vendor reacted to customer queries? For this, the trust sub-facet can use information from the usage history. However, the usage history need not necessarily be used (indeed, it can be disabled if the vendor wants to guarantee anonymity to the users of KCOs). In current systems, it is the portal which offers such trust related statistics. In a future system, it is quite conceivable that each content object could carry a reference to trust statistics collected by a mutually trusted third party. It is, in our view, important to provide an infrastructure that is able to support restrictive as well as liberal uses of content, and where the terms of these uses are transparent to all parties involved in the contract.

4. Components of the KCCA

The Knowledge Content Carrier Architecture (KCCA), together with the Knowledge Content Transfer Protocol (KCTP), defines an infrastructure for efficiently managing KCOs. We envisage a scenario in which multiple KCCA nodes communicate, cooperate and share or operate on data (KCOs) in a distributed environment. The architecture defines a generic middleware platform which is independent of any specific high-level protocol and is based on existing Semantic Web technologies. It provides a mechanism for the storage of KCOs and also operations to merge such knowledge bases. At both the application layer and the data layer, the architecture provides flexibility so that non-conformant applications or databases can be plugged in via appropriate wrappers. To handle messages between KCCA nodes effectively and to provide standardized communication capabilities, a simple request/response protocol is defined via KCTP. This also gives system providers the liberty to use their own implementation, as long as the core functionality is fulfilled. The architecture can also support extensions developed on top of the basic middleware.

4.1. Core Components

The KCCA middleware consists of the following core components (see Figure 1). The KCCA Repository provides interfaces to databases for the storage of content, metadata, ontologies and KCOs. It deals with how various databases, schemas and models can be integrated within the KCCA middleware architecture. Within the KCCA middleware, an ontology (using OWL [11]) is taken as a starting point for conceptual modeling, and RDF [12] is used as the data model for modeling the domains. The KCCA does not build its own component for automatic schema mapping between multiple data schemas but defines a framework in which external mapping solutions (e.g. MAFRA [13], D2RQ [14]) can be plugged in. The KCCA provides the necessary interfaces for querying and updating RDF database schemas and RDF instances, thus enabling multiple database systems to be plugged into the system. Within the architecture we split the RDF schema into two different schema types (Context Profile and View Profile), depending on the respective roles each one plays. The Context Profile is a naïve one-to-one mapping of an internal data model to an RDF data model. Context Profiles will normally be kept internal to a system and need not be shared with the external world (see Fig. 1). The View Profile is a specific view defined over one or more Context Profiles. The schema defined by the View Profile is the one that is shared with the external world (see Fig. 1). At the lowest level, the View Profile can correspond to the KCO schema with all its facets and facet elements.
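The split between Context Profile and View Profile can be illustrated with a small RDF example. The sketch below, written with the Python rdflib library, builds a naïve one-to-one "context" graph from an internal record and then projects it onto a "view" graph that uses a KCO-like vocabulary. The namespaces and property names are invented for this illustration; they are not the project's published schemas, and a real deployment would delegate the correspondence to a plugged-in mapping solution such as MAFRA or D2RQ rather than the hand-written loop shown here.

```python
# Sketch of Context Profile vs. View Profile, assuming rdflib is installed.
# Namespaces and property names below are illustrative, not the KCCA schemas.
from rdflib import Graph, Literal, Namespace, URIRef

INTERNAL = Namespace("http://example.org/internal#")    # internal data model (Context Profile)
KCOV = Namespace("http://example.org/kco-view#")         # KCO-like vocabulary (View Profile)

# Context Profile: a naive one-to-one RDF rendering of an internal record.
context = Graph()
item = URIRef("http://example.org/items/42")
context.add((item, INTERNAL.title, Literal("Boys playing football")))
context.add((item, INTERNAL.mime, Literal("video/mpeg")))

# View Profile: a view over the context graph, exposed to other KCCA nodes.
# An external mapping tool would normally supply this correspondence.
mapping = {INTERNAL.title: KCOV.propositionalContent, INTERNAL.mime: KCOV.mediaFormat}

view = Graph()
for subject, predicate, obj in context:
    if predicate in mapping:
        view.add((subject, mapping[predicate], obj))

for triple in view:
    print(triple)
```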

Figure 1: Overview of KCCA Components

4.1.1. KCCA Repository

As indicated above, the KCO Schema acts as a View Profile within the Repository. The Context Profile for the KCO Schema is the same as the View Profile, except in certain cases where systems may have proprietary extensions to the KCO Schema. Such extensions can be supported by changing the Context Profile of the KCO and by providing a suitable mapping between the Context Profile and the View Profile of the KCO Schema. The particular instances of KCOs exist as 'Context' within the Repository and in the KCCA they exist as 'View'.

4.1.2. KCCA Middleware Components

The KCCA Middleware Components provide specific components and modules that enable the building of the actual middleware. The components include: Presentation Layer, Authentication, Workflow Engine, Session Management, Inference Engine, Rule Layer and System Registry. Most of the components exist either as stand-alone components, e.g. workflow engines supporting task execution within a workflow (e.g. the Business Process Execution Language, BPEL [15]), or as part of another web application framework (e.g. the J2EE [16] platform includes authentication and session tracking).
The Presentation Layer in web information systems helps generate hypermedia presentations dynamically from resources stored within databases. The presentation layer deals with issues such as presentation delivery and content format (HTML [17] for the web, SMIL), the navigation structure of the presentation, dynamic composition of presentation resources, user interaction and personalization. The presentation layer is based on domain-specific information which, in our case, is guided by the respective domain ontology.
The Global System Registry acts as a simple HTTP GET/POST registry: a simple look-up for resources is provided via a minimal registry schema. Any client can access the Registry Server and ask for information about the system resources of any peer node that the Registry Server knows of. The Global Registry stores the local registry information for each node and uses a heartbeat mechanism to detect inactive nodes in the cluster: the heartbeat mechanism periodically polls the nodes and updates their status information in the Global Registry Server. The Local System Registry stores all the relevant information related to a local KCCA node, such as the access URI for the node and the services it supports.

4.1.3. KCCA Services Container (Request Broker)

The KCCA Services Container provides support for system and domain level services. The system level services include services for accessing KCCA Repositories, KCCA Middleware components and the KCCA System Registry. They also include KCO services which provide access to and manipulation of KCOs, with operations such as querying, addition or deletion on a particular facet of a domain-specific KCO.

The domain level services include services for the application domains and services related to multimedia systems (digital rights management, etc.).
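The heartbeat mechanism of the Global System Registry can be sketched as a simple polling loop. The sketch below assumes that each registered node exposes an HTTP endpoint at its access URI that answers a plain GET request; the registry layout, timeout and status values are our assumptions, since the paper does not specify them.

```python
# Sketch of the Global Registry heartbeat: poll each registered node's access URI
# and mark it active/inactive. Registry layout and status values are assumptions.
import urllib.request
import urllib.error

global_registry = {
    # node id -> local registry information (access URI, offered services, status)
    "node-a": {"access_uri": "http://node-a.example.org/kcca", "services": ["kco-query"], "status": "unknown"},
    "node-b": {"access_uri": "http://node-b.example.org/kcca", "services": ["kco-add"], "status": "unknown"},
}

def heartbeat(registry, timeout=5):
    """One heartbeat round: poll every node and update its status in the registry."""
    for node_id, info in registry.items():
        try:
            with urllib.request.urlopen(info["access_uri"], timeout=timeout) as response:
                info["status"] = "active" if response.status == 200 else "inactive"
        except (urllib.error.URLError, OSError):
            info["status"] = "inactive"

heartbeat(global_registry)
print({node: info["status"] for node, info in global_registry.items()})
```

In a running registry server this round would be scheduled periodically; a single pass is shown here to keep the sketch short.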

4.2. KCTP

The Knowledge Content Transfer Protocol (KCTP) provides a common middleware upon which multiple services providing varied functionality can be built. KCTP is a light-weight request/response protocol for sharing content and knowledge amongst multiple, distributed KCCA nodes (in database terms, a federation of KCCA nodes and their KCO repositories, which may be implemented using heterogeneous databases). KCTP is defined via a small ontology/schema expressed in RDF. KCTP consists of two core parts, the KCTP Profiles and the KCTP Request/Response Protocol, as described below.

4.2.1. KCTP Profiles

The profiles describe a protocol for the following:
• Sharing KCCA related system information such as protocol bindings, data encodings, repositories, etc.
• Supporting stateless operations for querying and updating data between multiple systems
• Performing state-oriented communication
• Describing tasks and services at the semantic level and operating on KCOs

4.2.2. KCTP Request/Response Protocol

KCTP provides a protocol for communication between multiple KCCA nodes. The KCTP message format is based on the FIPA ACL standard [2]. Table 1 describes the KCTP message fields in comparison with the FIPA ACL message format [18].

FIPA ACL Message | KCTP Message | Description
Performative | Performative | Denotes the type of communicative act [18] in KCTP. The performative can be a request (add, delete, query, merge or update), a reply, or Not Understood (error).
Sender | Sender | Denotes the identity of the sender.
Receiver | Receiver | Denotes the identity of the receiver.
Reply To | Reply To | Any subsequent reply to the received message is to be directed to the node given in this parameter, instead of the actual sender.
In Reply To | In Reply To | Denotes an expression that references an earlier action to which this message is a reply.
Protocol | Message Protocol | Denotes the interaction protocol that the sending agent is employing with this message.
Reply By | Reply By | Denotes a time and/or date expression indicating the latest time by which the sending node would like to receive a reply. The time is expressed according to the sender's view of the time on the sender's platform. The reply message is identified as the next sequential message in an interaction protocol, through the use of the In Reply To parameter.
Encoding | Message Encoding | Specifies the message encoding to the recipient agent.
- | Creation Time | Contains the creation date and time of the message header, added by the sender.
- | Service Type URI | Contains information about the type of the requested/response service.
- | Service URI | Denotes the address of the requested/response service. Together with the Service Type URI and the Access URI it gives the full address of the service, for instance: accessURI::ServiceTypeURI::ServiceURI.
- | Access URI | Denotes the access address of a particular node/endpoint.
- | End Point Identifier | Denotes the identifier (for instance, a name string) for the end point.
- | MessageID | Denotes the identifier for a particular message.

Table 1: KCTP message fields (in comparison with the FIPA ACL message format)

A KCTP message can be of the type Request or Reply. A Request can be a Query Request, an Add Request, a Delete Request, an Update Request or a Merge Request, each of which has appropriate parameters associated with it. A Query Request has two mandatory parameters, the query string and the query language (for instance RDQL [19]), and an optional parameter, the query constraints (if any). An Add/Delete/Update Request has two mandatory parameters, an RDF model (a set of statements) to be added/deleted/updated and the KCO facet description. A Merge Request has the set of URIs of the respective KCOs as its mandatory parameter. A Reply is either an error, with appropriate error descriptions if the request could not be fulfilled, or a successful reply with the necessary result set.
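To illustrate the message fields of Table 1 and the request types just described, the sketch below assembles a KCTP Query Request as a plain Python dictionary. Since the paper does not show a concrete wire representation (KCTP is defined via a small RDF schema), the dictionary keys simply mirror the Table 1 field names, the service and node identifiers are invented, and the JSON dump at the end is for illustration only.

```python
# Sketch of a KCTP Query Request, with keys mirroring the Table 1 field names.
# The serialization shown (JSON) is illustrative; KCTP is defined via an RDF schema.
import json
import time
import uuid

def make_query_request(sender_uri, receiver_uri, query_string,
                       query_language="RDQL", constraints=None):
    """Build a KCTP query request as a plain dictionary (field names follow Table 1)."""
    return {
        "performative": "request",          # add, delete, query, merge, update / reply / not-understood
        "request_type": "query",
        "sender": sender_uri,
        "receiver": receiver_uri,
        "reply_to": sender_uri,
        "in_reply_to": None,
        "message_protocol": "kctp-request-response",
        "reply_by": None,
        "message_encoding": "application/rdf+xml",
        "creation_time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
        "service_type_uri": "http://example.org/services/query",   # assumed value
        "service_uri": "kco-query",                                 # assumed value
        "access_uri": receiver_uri,
        "end_point_identifier": "node-b",
        "message_id": str(uuid.uuid4()),
        # Mandatory parameters of a Query Request, plus the optional constraints:
        "parameters": {
            "query_string": query_string,
            "query_language": query_language,
            "query_constraints": constraints,
        },
    }

request = make_query_request("http://node-a.example.org/kcca",
                             "http://node-b.example.org/kcca",
                             "SELECT ?kco WHERE (?kco, <dc:title>, ?title)")
print(json.dumps(request, indent=2))
```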

Both sender and receiver are KCCA nodes and have similar properties, such as the end point identifier, the access URI and the services that they offer.

4.2.3. KCTP Handler

The central component in the KCCA for handling KCTP messages is the KCTP Handler. It is the central entity that receives, processes and sends KCTP messages, and it is the core component for communicating with other nodes. It has the following sub-components:
• HTTP Sender/Receiver Servlet [20] - responsible for sending Request/Response messages to, and receiving Request/Response messages from, other KCCA systems. It verifies the HTTP message header and forwards the message to the HTTP Sender/Receiver component.
• HTTP Sender/Receiver - acts as a connecting component between the Serializer, the De-Serializer and the KCTP Router. It passes the serialized message to the De-Serializer and then the de-serialized message to the KCTP Router for processing. It receives KCTP messages from the KCTP Router and forwards them to the Serializer for serialization.
• Serializer/De-Serializer - these components perform the serialization and de-serialization of messages.
• KCTP Router - manages the routing of a KCTP message to and from the Services Container. It reads the KCTP message and takes the necessary action; for instance, if the KCTP message is a response from another KCCA node to a request sent earlier, it forwards the response to the application layer. The KCTP Router also interacts with the KCCA Local System Registry to check, get or register services. If the service desired by a request KCTP message is not present, an error message is sent back; otherwise the KCTP message is sent to the appropriate service in the Services Container for further processing (see the sketch after this list).
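The routing decision made by the KCTP Router can be summarised in a few lines. The sketch below is a minimal interpretation of the behaviour described above; the registry layout, the service interface and the error message format are assumptions, not the project's actual interfaces.

```python
# Minimal KCTP Router dispatch sketch. Registry layout, service interface and
# error format are assumptions based on the behaviour described in the text.
class KCTPRouter:
    def __init__(self, local_registry, services_container, application_layer):
        self.local_registry = local_registry          # service name -> service description
        self.services = services_container            # service name -> callable handler
        self.application_layer = application_layer    # callable that receives incoming responses

    def route(self, message):
        if message["performative"] == "reply":
            # Response to a request we sent earlier: hand it to the application layer.
            return self.application_layer(message)
        service = message.get("service_uri")
        if service not in self.local_registry:
            return {"performative": "not-understood",
                    "in_reply_to": message.get("message_id"),
                    "error": f"unknown service: {service}"}
        # Forward the request to the matching service in the Services Container.
        return self.services[service](message)

# Usage with dummy components:
router = KCTPRouter(
    local_registry={"kco-query": {"access_uri": "http://node-b.example.org/kcca"}},
    services_container={"kco-query": lambda msg: {"performative": "reply",
                                                  "in_reply_to": msg["message_id"],
                                                  "result": []}},
    application_layer=lambda msg: msg,
)
incoming = {"performative": "request", "service_uri": "kco-query", "message_id": "msg-1"}
print(router.route(incoming)["performative"])   # -> "reply"
```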

5. Multimedia usage example for KCOs

The flexible and generic nature of KCOs makes them an attractive framework for managing content and the associated meta data coherently. In this section, we describe scenarios that highlight the usage of KCOs.

Let us now consider a variation of our previous scenario: a music video which adapts itself depending on the cultural context in which the KCO is used. In other words, different variants of the tracks (the video and the audio) get played in accordance with the cultural considerations. The overall music video in this case is represented as a KCO. The actual multimedia data is stored or referenced in the Content Description facet of the KCO. The multimedia data consists of video and audio clips which are played in a temporally overlapping pattern to compose the actual music video. The video clips are merged with the audio clips to create the overall musical and visual effect. The audio clips consist of alternative tracks of playback music, types of lyrics sung by various artists, etc. We are aware of the fact that this may lead to heated debate about the artistic identity of a work of art: the recent (2006) political and cultural tensions concerning caricatures of Muslim religious figures are an interesting test case to discuss such identity and the contexts that define how (and by whom) a work of art is perceived and is intended to be perceived. Nonetheless, we suggest that the proposed model gives all parties more control over the usage of (and exposure to) digital content.
The propositional description will contain a semantic description of the subject matter of the KCO, which in our case is any information intended for the audience. These descriptions may include the names of the artists, the genre of the multimedia, and the types of musical instruments used. The meta data element will contain technical information about the content, such as the compression rates used, but possibly also information about encoding formats (e.g. the percussion track is computer-generated on a "Boss D888" drum computer). The Presentation facet contains descriptions of the actual pattern in which the media files are mixed to compose an overall music video which conforms to a cultural standard. For instance, the track will play in a different language (of the lyrics, that is, with a different audio clip), with different overlay percussion instruments, and possibly even with a different video, depending on the target audience. The Community facet will contain the possible task descriptions (e.g. this KCO can be used for Karaoke singing or as playback for amateur bands). The usage history may keep track of which cultural variants of the track were played where, who played them, what modifications (if any) were made by the user, and the date, time and place of playing. Note again that the recording of such a history may be enabled or disabled, and even when it is enabled, the user should know whether or not the information is sent outside the KCO.

By constraining such information to the actual object, we give control back to the user; control that is currently being eroded through the use of "cookies" which are controlled by the provider of the content without adequate consideration of the users' right to privacy. The Business facet contains descriptions of the digital rights issues, the negotiation protocol and the pricing for each multimedia clip. The Trust and Security facet contains digital signatures for all or some of the content, with a description of the appropriate issuing authorities, as well as security descriptions such as which agents are allowed to access (read, write, update, etc.) which parts of the KCO. The Self Description facet contains information about the structuring of the KCO itself. When the KCO is read into a multimedia agent/application for playback, the composition and playing of the tracks is done by applying the descriptions in the Presentation facet that match the specified cultural description.
An alternative usage scenario for KCOs could be a multimedia documentary about some of the socio-political events in the world over the years, supplemented with music video tracks. For example, in the 1980s, when the fear of nuclear war seemed to be much on people's minds, there was a whole genre of music referring to that fear, for instance "Nuclear Holocaust" by The Future or "After The Holocaust" by Nuclear Assault. The KCO in this case represents the whole multimedia documentary, with the music video tracks accompanied by appropriate descriptions of what event(s) each track relates to. It also includes references (hyperlinks) to appropriate news items which describe particular events or chains of events, as well as to images or any other relevant multimedia sources. The Content Description facet of the KCO contains the actual multimedia or references (hyperlinks) to the actual data sources. The propositional description contains semantic descriptions of the tracks and links to the necessary reference materials pertaining to the events on which the tracks were based. The Presentation facet contains a description of the pattern of play for the tracks (for instance, the chronological order of the events/songs), together with the associated semantic descriptions of the events, flashing pictures of the events inside the video presentation, etc.
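The adaptation step described above, in which tracks are composed by applying the Presentation facet descriptions that match a cultural context, can be sketched as a simple selection over variant descriptions. The structure of the descriptions below is invented for this illustration; the paper does not prescribe a concrete format.

```python
# Sketch of selecting track variants from the Presentation facet by target audience.
# The description format is illustrative; the paper does not fix a concrete schema.
presentation_facet = [
    {"target_audience": "default",
     "video": "video_main.mpg", "lyrics": "vocals_en.wav", "percussion": "drums_rock.wav"},
    {"target_audience": "audience-A",
     "video": "video_alt.mpg", "lyrics": "vocals_local.wav", "percussion": "drums_trad.wav"},
]

def select_variant(descriptions, audience):
    """Pick the presentation description matching the audience, falling back to the default."""
    for description in descriptions:
        if description["target_audience"] == audience:
            return description
    return next(d for d in descriptions if d["target_audience"] == "default")

playlist = select_variant(presentation_facet, audience="audience-A")
print(playlist["video"], playlist["lyrics"], playlist["percussion"])
```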

6. Conclusions and Further Work

In a recently finalized EU-funded research project, we developed the specification of KCOs as well as a prototype system of KCCA nodes which can manage KCOs in their respective repositories. We have also developed a methodology for introducing this novel type of information system into an organization, while also considering the integration of existing information systems and repositories into a federation of KCCA nodes. In follow-on projects at national and international levels, we are now applying the infrastructure to new domains such as TV broadcasting of live events, learning management systems, and back-end technology for so-called semantic Wikis, where we are experimenting with the KCO schema as the basis for a semantic-level interchange format between different Wikis.

7. Acknowledgements

We would like to acknowledge fruitful discussions with our colleagues Wolfgang Maass (FH Furtwangen, Business Models), Aldo Gangemi (CNR Rome, DOLCE Ontology) and Tobias Bürger (Salzburg Research, Multimedia Standards Modeling). The work reported here was part-funded by the EU projects METOKIS (contract number IST-FP6-507164, metokis.salzburgresearch.at) and LIVE (contract number FP6-27312, http://www.ist-live.org/).

8. References

[1] Martínez, J. M., Koenen, R., and Pereira, F., "MPEG-7: The Generic Multimedia Content Description Standard, Part 1", IEEE MultiMedia, 9(2), pp. 78-87, 2002.
[2] Foundation for Intelligent Physical Agents, http://www.fipa.org/.
[3] Tsichritzis, D., Fiume, E., Gibbs, S., and Nierstrasz, O., "KNOs: KNowledge Acquisition, Dissemination, and Manipulation Objects", ACM Transactions on Office Information Systems, 5(2), April 1987.
[4] Wiederhold, G., "Mediators in the Architecture of Future Information Systems", IEEE Computer, 25(3), pp. 38-49, 1992.
[5] Boll, S., and Klas, W., "ZYX - A Multimedia Document Model for Reuse and Adaptation of Multimedia Content", IEEE Transactions on Knowledge and Data Engineering, 13(3), pp. 361-382, 2001.
[6] Lagoze, C., and Hunter, J., "The ABC Ontology and Model", Journal of Digital Information, 2(2), Article No. 77, 2001, http://jodi.ecs.soton.ac.uk/Articles/v02/i02/Lagoze/lagozefinal.pdf
[7] Guarino, N., and Welty, C., "Evaluating Ontological Decisions with OntoClean", Communications of the ACM - Special Issue: Ontology Applications and Designs, 45(2), ACM Press, pp. 61-65, 2002.
[8] Gangemi, A., et al., "Sweetening Ontologies with DOLCE", in Proceedings of the 13th International Conference on Knowledge Engineering and Knowledge Management (EKAW'02), Springer, Sigüenza, Spain, pp. 166-181, October 2002.
[9] Behrendt, W., Gangemi, A., Maass, W., and Westenthaler, R., "Towards an Ontology-based Distributed Architecture for Paid Content", in Gómez-Pérez, A., and Euzenat, J. (Eds.), Springer, Berlin, p. 257, 2005.
[10] Bulterman, D., Grassel, G., Jansen, J., Koivisto, A., Layaïda, N., Michel, T., Mullender, S., and Zucker, D., "Synchronized Multimedia Integration Language (SMIL 2.1)", W3C Recommendation, 13 December 2005.
[11] Web Ontology Language (OWL), W3C Recommendation, 10 February 2004, http://www.w3.org/2001/sw/WebOnt/
[12] Lassila, O., and Swick, R. R., "Resource Description Framework (RDF) Model and Syntax Specification", W3C Recommendation, World Wide Web Consortium (W3C), February 1999.
[13] MAFRA: A Mapping Framework for Distributed Ontologies, http://sourceforge.net/projects/hmafra
[14] Bizer, C., Cyganiak, R., and Garbers, J., "D2RQ V0.3 - Treating Non-RDF Relational Databases as Virtual RDF Graphs".
[15] Business Process Execution Language for Web Services, version 1.1, 30 July 2002, http://www-106.ibm.com/developerworks/library/ws-bpel/
[16] Java 2 Enterprise Edition (J2EE), http://www.java.sun.com/j2ee/
[17] Raggett, D., Le Hors, A., and Jacobs, I., "HTML 4.01 Specification", W3C Recommendation, 24 December 1999.
[18] Foundation for Intelligent Physical Agents, FIPA ACL Message Structure Specification, 3 December 2002, http://www.fipa.org/specs/fipa00061/
[19] Seaborne, A., "RDQL - A Query Language for RDF", W3C Member Submission, 9 January 2004.
[20] Java Servlet Technology, http://java.sun.com/products/servlet/
