1 - 22 of 22
  • 1.
    Aghighi, Meysam
    et al.
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Bäckström, Christer
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Jonsson, Peter
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Ståhlberg, Simon
    Linköping University, Department of Computer and Information Science, Software and Systems. Linköping University, Faculty of Science & Engineering.
    Refining complexity analyses in planning by exploiting the exponential time hypothesis. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 78, no. 2, p. 157-175. Article in journal (Refereed)
    Abstract [en]

    The use of computational complexity in planning, and in AI in general, has always been a disputed topic. A major problem with ordinary worst-case analyses is that they provide no quantitative information: they do not tell us much about the running time of concrete algorithms, nor about the running time of optimal algorithms. We address such problems by presenting results based on the exponential time hypothesis (ETH), a widely accepted hypothesis concerning the time complexity of 3-SAT. Using this approach, we provide, for instance, almost matching upper and lower bounds on the time complexity of propositional planning.

  • 2.
    Alberti, Marco
    et al.
    Dept. of Computer Science, Nova Universidade de Lisboa, Lisbon.
    Dell'Acqua, Pierangelo
    Linköping University, Department of Science and Technology, Media and Information Technology. Linköping University, The Institute of Technology.
    Pereira, Luis M
    Dept. of Computer Science, Nova Universidade de Lisboa, Lisbon.
    Observation Strategies for Event Detection with Incidence on Runtime Verification: Theory, Algorithms, Experimentation. 2011. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 62, no. 3-4, p. 161-186. Article in journal (Refereed)
    Abstract [en]

    Many applications (such as system and user monitoring, runtime verification, diagnosis, observation-based decision making, and intention recognition) require detecting the occurrence of an event in a system, which entails the ability to observe the system. Observation can be costly, so it makes sense to reduce the number of observations without losing full certainty about the event’s actual occurrence. In this paper, we propose a formalization of this problem. We formally show that, whenever the event to be detected follows a discrete spatial or temporal pattern, it is possible to reduce the number of observations. We discuss exact and approximate algorithms to solve the problem, and provide an experimental evaluation of them. We apply the resulting algorithms to the verification of linear temporal logic formulæ. Finally, we discuss possible generalizations and extensions and, in particular, how event detection can benefit from logic programming techniques.

  • 3.
    Bialek, Lukasz
    et al.
    Univ Warsaw, Poland.
    Dunin-Keplicz, Barbara
    Univ Warsaw, Poland.
    Szalas, Andrzej
    Linköping University, Department of Computer and Information Science, Artificial Intelligence and Integrated Computer Systems. Linköping University, Faculty of Science & Engineering. Univ Warsaw, Poland.
    A paraconsistent approach to actions in informationally complex environments. 2019. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 86, no. 4, p. 231-255. Article in journal (Refereed)
    Abstract [en]

    Contemporary systems situated in real-world open environments frequently have to cope with incomplete and inconsistent information, which typically increases the complexity of reasoning and decision processes. Realistic modeling of such informationally complex environments calls for nuanced tools. In particular, incomplete and inconsistent information should neither trivialize nor halt reasoning and planning. The paper introduces ACTLOG, a rule-based four-valued language designed to specify actions in a paraconsistent and paracomplete manner. ACTLOG is an extension of 4QL(Bel), a language for reasoning with paraconsistent belief bases. Each belief base stores multiple world representations. In this context, an ACTLOG action may be seen as a belief-base transformer. In contrast to other approaches, ACTLOG actions can be executed even when the underlying belief base content is inconsistent and/or partial. ACTLOG provides nuanced action specification tools, allowing for subtle interplay among the various forms of nonmonotonic, paraconsistent, paracomplete and doxastic reasoning methods applicable in informationally complex environments. Despite its rich modeling possibilities, it remains tractable. ACTLOG permits composite actions by using sequential and parallel compositions as well as conditional specifications. The framework is illustrated with a decontamination case study known from the literature.

  • 4.
    Boström, Henrik
    et al.
    Stockholm University, Faculty of Social Sciences, Department of Computer and Systems Sciences.
    Linusson, Henrik
    Lofstrom, Tuve
    Johansson, Ulf
    Accelerating difficulty estimation for conformal regression forests. 2017. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 81, no. 1-2, p. 125-144. Article in journal (Refereed)
    Abstract [en]

    The conformal prediction framework allows the probability of making incorrect predictions to be specified via a user-provided confidence level. In addition to a learning algorithm, the framework requires a real-valued function, called a nonconformity measure, to be specified. The nonconformity measure does not affect the error rate, but the resulting efficiency, i.e., the size of the output prediction regions, may vary substantially. A recent large-scale empirical evaluation of conformal regression approaches showed that using random forests as the learning algorithm, together with a nonconformity measure based on out-of-bag errors normalized using a nearest-neighbor-based difficulty estimate, resulted in state-of-the-art performance with respect to efficiency. However, the nearest-neighbor procedure incurs a significant computational cost. In this study, a more straightforward nonconformity measure is investigated, where the difficulty estimate employed for normalization is based on the variance of the predictions made by the trees in a forest. A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed. The evaluation moreover shows that the computational cost of the variance-based measure is several orders of magnitude lower than that of the nearest-neighbor-based nonconformity measure. The use of out-of-bag instances for calibration does, however, result in nonconformity scores that are distributed differently from those obtained from test instances, questioning the validity of the approach. An adjustment of the variance-based measure is presented, which is shown to be valid and also to have a significant positive effect on the efficiency. For conformal regression forests, the variance-based nonconformity measure is hence a computationally efficient and theoretically well-founded alternative to the nearest-neighbor procedure.

  • 5.
    Boström, Henrik
    et al.
    Stockholms universitet, Institutionen för data- och systemvetenskap.
    Linusson, Henrik
    Löfström, Tuve
    Johansson, Ulf
    Accelerating difficulty estimation for conformal regression forests. 2017. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 81, no. 1-2, p. 125-144. Article in journal (Refereed)
    Abstract [en]

    The conformal prediction framework allows the probability of making incorrect predictions to be specified via a user-provided confidence level. In addition to a learning algorithm, the framework requires a real-valued function, called a nonconformity measure, to be specified. The nonconformity measure does not affect the error rate, but the resulting efficiency, i.e., the size of the output prediction regions, may vary substantially. A recent large-scale empirical evaluation of conformal regression approaches showed that using random forests as the learning algorithm, together with a nonconformity measure based on out-of-bag errors normalized using a nearest-neighbor-based difficulty estimate, resulted in state-of-the-art performance with respect to efficiency. However, the nearest-neighbor procedure incurs a significant computational cost. In this study, a more straightforward nonconformity measure is investigated, where the difficulty estimate employed for normalization is based on the variance of the predictions made by the trees in a forest. A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed. The evaluation moreover shows that the computational cost of the variance-based measure is several orders of magnitude lower than that of the nearest-neighbor-based nonconformity measure. The use of out-of-bag instances for calibration does, however, result in nonconformity scores that are distributed differently from those obtained from test instances, questioning the validity of the approach. An adjustment of the variance-based measure is presented, which is shown to be valid and also to have a significant positive effect on the efficiency. For conformal regression forests, the variance-based nonconformity measure is hence a computationally efficient and theoretically well-founded alternative to the nearest-neighbor procedure.

  • 6.
    Boström, Henrik
    et al.
    Department of Computer and Systems Sciences, Stockholm University, Stockholm, Sweden.
    Linusson, Henrik
    Department of Information Technology, University of Borås, Borås, Sweden.
    Löfström, Tuwe
    Jönköping University, School of Engineering, JTH, Computer Science and Informatics, JTH, Jönköping AI Lab (JAIL). Department of Information Technology, University of Borås, Borås, Sweden.
    Johansson, Ulf
    Jönköping University, School of Engineering, JTH, Computer Science and Informatics, JTH, Jönköping AI Lab (JAIL).
    Accelerating difficulty estimation for conformal regression forests. 2017. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 81, no. 1-2, p. 125-144. Article in journal (Refereed)
    Abstract [en]

    The conformal prediction framework allows the probability of making incorrect predictions to be specified via a user-provided confidence level. In addition to a learning algorithm, the framework requires a real-valued function, called a nonconformity measure, to be specified. The nonconformity measure does not affect the error rate, but the resulting efficiency, i.e., the size of the output prediction regions, may vary substantially. A recent large-scale empirical evaluation of conformal regression approaches showed that using random forests as the learning algorithm, together with a nonconformity measure based on out-of-bag errors normalized using a nearest-neighbor-based difficulty estimate, resulted in state-of-the-art performance with respect to efficiency. However, the nearest-neighbor procedure incurs a significant computational cost. In this study, a more straightforward nonconformity measure is investigated, where the difficulty estimate employed for normalization is based on the variance of the predictions made by the trees in a forest. A large-scale empirical evaluation is presented, showing that both the nearest-neighbor-based and the variance-based measures significantly outperform a standard (non-normalized) nonconformity measure, while no significant difference in efficiency between the two normalized approaches is observed. The evaluation moreover shows that the computational cost of the variance-based measure is several orders of magnitude lower than that of the nearest-neighbor-based nonconformity measure. The use of out-of-bag instances for calibration does, however, result in nonconformity scores that are distributed differently from those obtained from test instances, questioning the validity of the approach. An adjustment of the variance-based measure is presented, which is shown to be valid and also to have a significant positive effect on the efficiency. For conformal regression forests, the variance-based nonconformity measure is hence a computationally efficient and theoretically well-founded alternative to the nearest-neighbor procedure.
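
    The variance-based normalization described in the records above can be sketched in a few lines. The following is a minimal split-conformal sketch in numpy, not the authors' implementation: the per-tree predictions are simulated as noisy copies of a known target rather than produced by an actual random forest, and the smoothing constant `beta` is an assumed parameter.

    ```python
    import numpy as np

    def variance_nonconformity(tree_preds, y, beta=0.01):
        """Score |y - mean prediction| / (std over trees + beta)."""
        mu = tree_preds.mean(axis=0)
        sigma = tree_preds.std(axis=0)
        return np.abs(y - mu) / (sigma + beta)

    def conformal_interval(preds_cal, y_cal, preds_test, eps=0.1, beta=0.01):
        """Prediction intervals at confidence 1 - eps, calibrated on held-out data."""
        scores = np.sort(variance_nonconformity(preds_cal, y_cal, beta))
        n = len(y_cal)
        k = int(np.ceil((1 - eps) * (n + 1))) - 1   # conformal quantile index
        alpha = scores[min(k, n - 1)]
        mu = preds_test.mean(axis=0)
        sigma = preds_test.std(axis=0)
        half = alpha * (sigma + beta)               # wider where trees disagree
        return mu - half, mu + half

    rng = np.random.default_rng(0)
    # toy "forest": 50 trees whose predictions are noisy copies of sin(x)
    x_cal, x_test = rng.uniform(0, 1, 200), rng.uniform(0, 1, 50)
    y_cal, y_test = np.sin(x_cal), np.sin(x_test)
    preds_cal = y_cal + rng.normal(0, 0.1, (50, 200))
    preds_test = y_test + rng.normal(0, 0.1, (50, 50))

    lo, hi = conformal_interval(preds_cal, y_cal, preds_test, eps=0.1)
    coverage = np.mean((lo <= y_test) & (y_test <= hi))
    print(coverage)  # empirical coverage on the test points
    ```

    The point of the measure is visible in `half`: interval width scales with the spread of the trees' predictions, which is available for free once the forest has been evaluated, unlike a nearest-neighbor difficulty estimate.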

  • 7.
    Cantwell, John
    et al.
    KTH, School of Architecture and the Built Environment (ABE), Philosophy and History. SCAS Uppsala, Uppsala, Sweden.
    Rott, Hans
    SCAS Uppsala, Uppsala, Sweden ; University of Regensburg, Regensburg, Germany.
    Probability, coherent belief and coherent belief changes. 2019. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 87, p. 259-291. Article in journal (Refereed)
    Abstract [en]

    This paper is about the statics and dynamics of belief states that are represented by pairs consisting of an agent’s credences (represented by a subjective probability measure) and her categorical beliefs (represented by a set of possible worlds). Regarding the static side, we argue that the latter proposition should be coherent with respect to the probability measure and that its probability should reach a certain threshold value. On the dynamic side, we advocate Jeffrey conditionalisation as the principal mode of changing one’s belief state. This updating method fits the idea of the Lockean Thesis better than plain Bayesian conditionalisation, and it affords a flexible method for adding and withdrawing categorical beliefs. We show that it fails to satisfy the traditional principles of Inclusion and Preservation for belief revision and the principle of Recovery for belief withdrawals, as well as the Levi and Harper identities. We take this to be a problem for the latter principles rather than for the idea of coherent belief change.
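
    The Jeffrey conditionalisation rule the abstract refers to is the standard one: given a partition $\{E_i\}$ whose post-update credences are $q_i$ (with $\sum_i q_i = 1$), the new probability of any proposition $A$ is the mixture

    ```latex
    P_{\mathrm{new}}(A) \;=\; \sum_i P(A \mid E_i)\, q_i .
    ```

    Plain Bayesian conditionalisation is the special case where some $q_j = 1$, i.e. one evidence cell becomes certain; Jeffrey's rule drops that requirement, which is what makes it a flexible basis for adding and withdrawing categorical beliefs.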

  • 8. Confalonieri, Roberto
    et al.
    Nieves, Juan Carlos
    Umeå University, Faculty of Science and Technology, Department of Computing Science.
    Osorio, Mauricio
    Vazquez-Salceda, Javier
    Dealing with explicit preferences and uncertainty in answer set programming. 2012. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 65, no. 2-3, p. 159-198. Article in journal (Refereed)
    Abstract [en]

    In this paper, we show how the formalism of Logic Programs with Ordered Disjunction (LPODs) and Possibilistic Answer Set Programming (PASP) can be merged into the single framework of Logic Programs with Possibilistic Ordered Disjunction (LPPODs). The LPPODs framework embeds in a unified way several aspects of common-sense reasoning, nonmonotonicity, preferences, and uncertainty, where each part is underpinned by a well-established formalism. On one hand, from LPODs it inherits the distinctive feature of expressing context-dependent qualitative preferences among different alternatives (modeled as the atoms of a logic program). On the other hand, PASP allows qualitative certainty statements about the rules themselves (modeled as necessity values according to possibilistic logic) to be captured. In this way, the LPPODs framework supports reasoning that is nonmonotonic, preference-aware and uncertainty-aware. The LPPODs syntax allows for the specification of (1) preferences among the exceptions to default rules, and (2) necessity values about the certainty of program rules. As a result, preferences and uncertainty can be used to select the preferred uncertain default rules of an LPPOD and, consequently, to order its possibilistic answer sets. Furthermore, we describe the implementation of an ASP-based solver able to compute the LPPODs semantics.

  • 9. Dix, Juergen
    et al.
    Hansson, Sven Ove
    Philosophy and History, KTH, School of Architecture and the Built Environment (ABE), Philosophy and History of Technology, Philosophy.
    Kern-Isberner, Gabriele
    Simari, Guillermo R.
    Belief change and argumentation in multi-agent scenarios. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 78, no. 3-4, p. 177-179. Article in journal (Other academic)
  • 10.
    Doherty, Patrick
    et al.
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, KPLAB - Knowledge Processing Lab.
    Kvarnström, Jonas
    Linköping University, The Institute of Technology. Linköping University, Department of Computer and Information Science, KPLAB - Knowledge Processing Lab.
    TALplanner: A temporal logic based forward chaining planner. 2000. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 30, no. 1-4, p. 119-169. Article in journal (Refereed)
    Abstract [en]

    We present TALplanner, a forward-chaining planner based on the use of domain-dependent search control knowledge represented as formulas in the Temporal Action Logic (TAL). TAL is a narrative-based linear metric time logic used for reasoning about action and change in incompletely specified dynamic environments. TAL serves as the formal semantic basis for TALplanner: a TAL goal narrative with control formulas is input to TALplanner, which then generates a TAL narrative that entails the goal and control formulas. The sequential version of TALplanner is presented. The expressivity of plan operators is then extended to deal with an interesting class of resource types. An algorithm for generating concurrent plans, where operators have varying durations and internal state, is also presented. All versions of TALplanner have been implemented. The potential of these techniques is demonstrated by applying TALplanner to a number of standard planning benchmarks from the literature.

  • 11.
    Duracz, Jan
    et al.
    Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), Halmstad Embedded and Intelligent Systems Research (EIS), Centre for Research on Embedded Systems (CERES).
    Konečný, Michal
    School of Engineering and Applied Science, Aston University, Birmingham, United Kingdom.
    Polynomial Function Intervals for Floating-Point Software Verification. 2014. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 70, no. 4, p. 351-398. Article in journal (Refereed)
    Abstract [en]

    The focus of our work is the verification of tight functional properties of numerical programs, such as showing that a floating-point implementation of Riemann integration computes a close approximation of the exact integral. Programmers and engineers writing such programs will benefit from verification tools that support an expressive specification language and that are highly automated. Our work provides a new method for verification of numerical software, supporting a substantially more expressive language for specifications than other publicly available automated tools. The additional expressivity in the specification language is provided by two constructs. First, the specification can feature inclusions between interval arithmetic expressions. Second, the integral operator from classical analysis can be used in the specifications, where the integration bounds can be arbitrary expressions over real variables. To support our claim of expressivity, we outline the verification of four example programs, including the integration example mentioned earlier. A key component of our method is an algorithm for proving numerical theorems. This algorithm is based on automatic polynomial approximation of non-linear real and real-interval functions defined by expressions. The PolyPaver tool is our implementation of the algorithm and its source code is publicly available. In this paper we report on experiments using PolyPaver that indicate that the additional expressivity does not come at a performance cost when comparing with other publicly available state-of-the-art provers. We also include a scalability study that explores the limits of PolyPaver in proving tight functional specifications of progressively larger randomly generated programs.

  • 12.
    Eklund, Martin
    et al.
    Uppsala University, Disciplinary Domain of Medicine and Pharmacy, Faculty of Pharmacy, Department of Pharmaceutical Biosciences.
    Norinder, Ulf
    Boyer, Scott
    Carlsson, Lars
    The application of conformal prediction to the drug discovery process. 2015. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 74, no. 1-2, p. 117-132. Article in journal (Refereed)
    Abstract [en]

    QSAR modeling is a method for predicting properties, e.g. the solubility or toxicity, of chemical compounds using machine learning techniques. QSAR is in widespread use within the pharmaceutical industry to prioritize compounds for experimental testing or to alert for potential toxicity during the drug discovery process. However, the confidence or reliability of predictions from a QSAR model is difficult to assess accurately. We frame the application of QSAR to preclinical drug development in an off-line inductive conformal prediction framework and apply it prospectively to historical data collected from four different assays within AstraZeneca over a time course of about five years. The results indicate weakened validity of the conformal predictor due to violations of the randomness assumption. The validity can be strengthened by adopting semi-off-line conformal prediction. The non-randomness of the data prevents exactly valid predictions, but comparisons to the results of a traditional QSAR procedure applied to the same data indicate that conformal predictions are highly useful in the drug discovery process.

  • 13.
    Flener, Pierre
    et al.
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computing Science.
    Pearson, Justin
    Uppsala University, Disciplinary Domain of Science and Technology, Mathematics and Computer Science, Department of Information Technology, Computer Systems.
    Sellmann, Meinolf
    Static and dynamic structural symmetry breaking. 2009. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 57, no. 1, p. 37-57. Article in journal (Refereed)
  • 14.
    Hegner, Stephen
    Umeå University, Faculty of Science and Technology, Department of Computing Science.
    Information-based distance measures and the canonical reflection of view updates. 2011. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 63, no. 3-4, p. 317-355. Article in journal (Refereed)
    Abstract [en]

    For the problem of reflecting an update on a database view to the main schema, the constant-complement strategies are precisely those which avoid all update anomalies, and so define the gold standard for well-behaved solutions to the problem. However, the families of view updates which are supported under such strategies are limited, so it is sometimes necessary to go beyond them, albeit in a systematic fashion. In this work, an investigation of such extended strategies is initiated for relational schemata. The approach is to characterize the information content of a database instance, and then require that the optimal reflection of a view update to the main schema embody the least possible change of information. The key property is identified to be strong monotonicity of the view, meaning that view insertions may always be reflected as insertions to the main schema, and likewise for deletions. In that context it is shown that for insertions and deletions, an optimal update, entailing the least change of information, exists and is unique up to isomorphism for wide classes of constraints.

  • 15.
    Hegner, Stephen J.
    Umeå University, Faculty of Science and Technology, Department of Computing Science.
    Constraint-preserving snapshot isolation. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 76, no. 3-4, p. 281-326. Article in journal (Refereed)
    Abstract [en]

    A method for detecting potential violations of integrity constraints of concurrent transactions running under snapshot isolation (SI) is presented. Although SI provides a high level of isolation, it does not, by itself, ensure that all integrity constraints are satisfied. In particular, while current implementations of SI enforce all internal integrity constraints, such as key constraints, they fail to enforce constraints implemented via triggers. One remedy is to turn to serializable SI (SSI), in which full serializability is guaranteed. However, SSI comes at the price of either a substantial number of false positives, or else a high cost of constructing the full direct serialization graph. In this work, a compromise approach, called constraint-preserving snapshot isolation (CPSI), is developed, which while not guaranteeing full serializability, does guarantee that all constraints, including those enforced via triggers, are satisfied. In contrast to full SSI, CPSI requires testing concurrent transactions for conflict only pairwise, and thus involves substantially less overhead while providing a foundation for resolving conflicts via negotiation rather than via abort and restart. As is the case with SSI, CPSI can result in false positives. To address this, a hybrid approach is also developed which combines CPSI with a special version of SSI called CSSI, resulting in substantially fewer false positives than would occur using either approach alone.

  • 16.
    Klügl, Franziska
    Örebro University, School of Science and Technology.
    Using the affordance concept for model design in agent-based simulation. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 78, no. 1, p. 21-44. Article in journal (Refereed)
    Abstract [en]

    When designing an agent-based simulation model, a central challenge is to formulate the appropriate interactions between agents as well as between agents and their environment. In this contribution we present the idea of capturing agent-environment interactions based on the “affordance” concept. Originating in ecological psychology, affordances represent relations between environmental objects and the potential actions that agents may perform using those objects. We assume that explicitly handling affordances, based on semantic annotation of entities in simulated space, may offer a higher abstraction level for dealing with potential interaction. Our approach has two elements: first, a methodology for using the affordance concept to identify interactions, and second, a suggestion for integrating affordances into agents’ decision making. We illustrate our approach with an agent-based model of post-earthquake behavior.

  • 17.
    Laxhammar, Rikard
    et al.
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre. Saab Security and Defence Solutions, Järfälla, Sweden.
    Falkman, Göran
    University of Skövde, School of Informatics. University of Skövde, The Informatics Research Centre.
    Inductive conformal anomaly detection for sequential detection of anomalous sub-trajectories. 2015. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 74, no. 1-2, p. 67-94. Article in journal (Refereed)
  • 18.
    Marinakis, Yannis
    et al.
    School of Production Engineering and Management, Technical University of Crete, Decision Support Systems Laboratory, Department of Production Engineering and Management, Technical University of Crete.
    Marinaki, Magdalene
    School of Production Engineering and Management, Technical University of Crete.
    Migdalas, Athanasios
    Luleå University of Technology, Department of Business Administration, Technology and Social Sciences, Business Administration and Industrial Engineering.
    A hybrid clonal selection algorithm for the location routing problem with stochastic demands. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 76, no. 1-2, p. 121-142. Article in journal (Refereed)
    Abstract [en]

    In this paper, a new formulation of the Location Routing Problem with Stochastic Demands is presented. The problem is treated as a two-phase problem: in the first phase it is determined which depots will be opened and which customers will be assigned to them, while in the second phase, for each of the open depots, a Vehicle Routing Problem with Stochastic Demands is solved. For the solution of the problem a Hybrid Clonal Selection Algorithm is applied in which, in the two basic phases of the Clonal Selection Algorithm, a Variable Neighborhood Search algorithm and an Iterated Local Search algorithm, respectively, are utilized. As there are no benchmark instances in the literature for this form of the problem, a number of new test instances have been created based on instances of the Capacitated Location Routing Problem. The algorithm is compared both with other variants of the Clonal Selection Algorithm and with other evolutionary algorithms.

  • 19.
    Odelstad, Jan
    et al.
    University of Gävle, Department of Mathematics, Natural and Computer Sciences, Ämnesavdelningen för datavetenskap.
    Boman, M
    Algebras for Agent Norm-Regulation. 2004. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 42, no. 1-3, p. 141-166. Article in journal (Refereed)
    Abstract [en]

    An abstract architecture for idealized multi-agent systems whose behaviour is regulated by normative systems is developed and discussed. Agent choices are determined partly by the preference ordering of possible states and partly by normative considerations: the agent chooses the act that leads to the best outcome among all permissible actions. Whether an action is permissible depends on whether the result of performing it leads to a state satisfying a condition that is forbidden, according to the norms regulating the multi-agent system. This idea is formalized by defining set-theoretic predicates characterizing multi-agent systems. The definition of the predicate uses decision theory, the Kanger-Lindahl theory of normative positions, and an algebraic representation of normative systems.

  • 20. Odelstad, Jan
    et al.
    Boman, Magnus
    KTH, Superseded Departments, Computer and Systems Sciences, DSV.
    Algebras for Agent Norm-Regulation. 2004. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 42, no. 1-3, p. 141-166. Article in journal (Refereed)
    Abstract [en]

    An abstract architecture for idealized multi-agent systems whose behaviour is regulated by normative systems is developed and discussed. Agent choices are determined partially by the preference ordering of possible states and partially by normative considerations: the agent chooses the act that leads to the best outcome among all permissible actions. Whether an action is permissible depends on whether the result of performing it leads to a state satisfying a condition that is forbidden according to the norms regulating the multi-agent system. This idea is formalized by defining set-theoretic predicates characterizing multi-agent systems. The definition of the predicate uses decision theory, the Kanger–Lindahl theory of normative positions, and an algebraic representation of normative systems.

  • 21. Odelstad, Jan
    et al.
    Boman, Magnus
    RISE, Swedish ICT, SICS, Decisions, Networks and Analytics lab.
    Algebras for agent norm-regulation. 2004. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 42, no. 1-3, p. 141-166. Article in journal (Refereed)
    Abstract [en]

    An abstract architecture for idealized multi-agent systems whose behaviour is regulated by normative systems is developed and discussed. Agent choices are determined partially by the preference ordering of possible states and partially by normative considerations: the agent chooses the act that leads to the best outcome among all permissible actions. Whether an action is permissible depends on whether the result of performing it leads to a state satisfying a condition that is forbidden according to the norms regulating the multi-agent system. This idea is formalized by defining set-theoretic predicates characterizing multi-agent systems. The definition of the predicate uses decision theory, the Kanger-Lindahl theory of normative positions, and an algebraic representation of normative systems.

  • 22.
    Pei, Jun
    et al.
    School of Management, Hefei University of Technology.
    Liu, Xinbao
    School of Management, Hefei University of Technology.
    Fan, Wenjuan
    School of Management, Hefei University of Technology.
    Pardalos, Panos M.
    Department of Industrial and Systems Engineering, Center for Applied Optimization, University of Florida.
    Migdalas, Athanasios
    Luleå University of Technology, Department of Business Administration, Technology and Social Sciences, Business Administration and Industrial Engineering.
    Yang, Shanlin
    School of Management, Hefei University of Technology.
    Scheduling jobs on a single serial-batching machine with dynamic job arrivals and multiple job types. 2016. In: Annals of Mathematics and Artificial Intelligence, ISSN 1012-2443, E-ISSN 1573-7470, Vol. 76, no. 1-2, p. 215-228. Article in journal (Refereed)
    Abstract [en]

    This paper investigates a scheduling model with the co-existing features of serial-batching, dynamic job arrivals, multiple job types, and setup times. In the proposed model, the jobs of all types are first partitioned into serial batches, which are then processed on a single serial-batching machine with an independent constant setup time for each new batch. To solve this scheduling problem, we divide it into two phases based on job arrival times, and we derive and prove constructive properties for both phases. Relying on these properties, we develop a two-phase hybrid algorithm (TPHA). In addition, a valid lower bound of the problem is derived and used to validate the quality of the proposed algorithm. Computational experiments with both small- and large-scale problems are performed to evaluate the performance of TPHA. The computational results indicate that TPHA outperforms seven other heuristic algorithms. For all test problems of different job sizes, the average gap percentage between the makespan obtained using TPHA and the lower bound does not exceed 5.41%.
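
The serial-batching setting described above can be made concrete with a small makespan computation; this is not the paper's TPHA, only a sketch under stated assumptions (a batch starts after the machine is free and all its jobs have arrived, the constant setup precedes each batch, and the batch occupies the machine for the sum of its jobs' processing times; the job data is hypothetical).

```python
SETUP = 2.0  # constant setup time incurred before every batch (assumed)

# Hypothetical jobs: name -> (arrival_time, processing_time)
jobs = {"j1": (0.0, 3.0), "j2": (1.0, 2.0), "j3": (6.0, 4.0)}

def makespan(batches):
    """Process batches in the given order. A batch cannot start its setup
    until the machine is free and every job in it has arrived; on a
    serial-batching machine the batch then occupies the machine for the
    sum of its jobs' processing times."""
    t = 0.0
    for batch in batches:
        ready = max(jobs[j][0] for j in batch)   # latest arrival in the batch
        start = max(t, ready) + SETUP            # wait for machine and arrivals, then set up
        t = start + sum(jobs[j][1] for j in batch)
    return t

cmax = makespan([["j1", "j2"], ["j3"]])
```

Here batching j1 and j2 together amortizes one setup over two jobs, while the late-arriving j3 forms its own batch; deciding such partitions well, with arrivals and multiple job types, is exactly what makes the problem hard.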
