|14:00-14:45||Piotr Faliszewski (AGH University of Science and Technology, Krakow) |
How to Mix Multiwinner Voting Rules and Why?
Abstract: There are three idealized types of outcomes of multiwinner voting rules. The committees may either consist of individually excellent candidates (as needed, e.g., when choosing finalists of competitions), may be diverse (e.g., when we look for products to offer on the homepage of an online store), or may be proportional (e.g., when we want to form a parliament). However, in reality we may often be interested in committees that achieve various levels of compromise between these notions. For example, in degressively proportional parliamentary elections we want to increase the representation of parties with smaller support, at the expense of those with larger support. As a consequence, degressive proportionality can be seen as mixing the ideals of proportionality and diversity. In this talk I will discuss various ways of forming multiwinner voting rules that achieve such compromises and analyse their computational complexity. I will also show that the notion of proportionality is much trickier than one might suspect.
|15:15-16:00||Boas Kluiving (University of Amsterdam) |
Analysing Irresolute Multiwinner Voting Rules via SAT Solving
Abstract: Suppose we want to elect a committee or parliament by using a voting rule under which each voter is asked to approve of a subset of the candidates. There are several properties we may want such a voting rule to satisfy: it should ensure that voters do not have an incentive to misrepresent their preferences, that outcomes respect some form of proportional representation of the voters, and that outcomes are Pareto efficient. We show that it is impossible to design a voting rule that satisfies all three properties and explore what possibilities there are when we weaken our requirements. Of special interest is the methodology we use: part of the proof can be outsourced to a SAT (satisfiability) solver by translating an instance of the statement of our main theorem into a set of formulas in propositional logic. While prior work has considered similar questions for the special case of resolute voting rules, which do not allow for ties between outcomes, we focus on the fact that, in practice, most voting rules do allow for the possibility of such ties. This is joint work with Adriaan de Vries, Pepijn Vrijbergen, Arthur Boixel, and Ulle Endriss.
|16:30-17:15||Hans Peters (Maastricht University) |
Choosing k from m
Abstract: We show that feasible elimination procedures (Peleg, 1978) can be used to select k from m alternatives. An application is the choice of a committee of size k from a set of m available candidates. An important advantage of this method is the core property: no coalition of voters can guarantee an outcome that is preferred by all its members. We also show that the problem of determining whether a specific k-tuple can result from a feasible elimination procedure is computationally equivalent to the problem of finding a maximal matching in a bipartite graph. Additionally, we provide an axiomatic characterization of the method of feasible elimination. This presentation is based on joint work with Bezalel Peleg.
|17:30||Drinks in a nearby café|
Location: Room T3-25, Mandeville Building, Campus Woudestein, Erasmus University Rotterdam (directions)
Hosts: Måns Abrahamson, Constanze Binder, Kirsten Rohde (Erasmus University Rotterdam)
|14:00-15:00||Kirsten Rohde (Erasmus University Rotterdam) |
Social Risk Attitudes for Health and Money
Abstract: Many risks have a social as well as an individual dimension. This paper studies attitudes towards social risks for health and money. In an experiment, participants are asked to report their certainty equivalents of risks that are distributed over individuals in society. In every society, all individuals face the same risks. Societies differ regarding the correlations of risks across individuals. Attitudes towards such correlations shed light on people’s concerns for inequality and collective risk (Rohde and Rohde, 2015). In this study we compare decisions for social risks concerning health and money. We also compare social risks framed in terms of gains and losses. (This is joint work with Ingrid Rohde.)
|15:15-16:15||Wulf Gaertner (Universität Osnabrück) |
An Experimental Game of Loss Sharing
Abstract: We conduct a lab-experimental study of bargaining over the distribution of monetary losses. Groups of four differently endowed participants must agree, as a group, on the contribution each participant will make to cover a financial loss imposed on the group. The study sheds light on burden sharing and what kind of loss allocation rules groups are willing to adopt. Given the experimental results that we gathered, two rules stand out as possible explanations: the constrained equal awards rule, which has been characterized by Moulin (2000) and Herrero and Villar (2001), and a new theoretical model that will be characterized in this paper. Many proposals that participants made are close to a progressive rule which exempts the person with the lowest endowment from any losses and puts a larger burden on those with higher endowments. Such proposals can be explained by our model. However, there is also a large number of proposals which suggest that the two lowest groups in terms of initial endowments be exempted from any loss sharing. Such proposals are in conformity with the constrained equal awards rule. (This is joint work with Lars Schwettmann and Yongsheng Xu.)
|16:30-17:30||Erik Schokkaert (KU Leuven) |
Empirical Social Choice with Incomplete and Unstable Preferences
Abstract: Different papers in empirical social choice have worked with different interpretations of the overall concept of "individual preferences" and/or have implicitly assumed that the preference relation is stable and complete. I will illustrate this for some of the findings obtained with large representative surveys and I will argue that a stronger theoretical framework is needed to make this research useful for normative analysis. Questionnaire-experimental work fares better in this regard but has also largely neglected that preferences may be unstable and/or incomplete. I propose to add a deliberation stage to the empirical setup of these studies. I finally discuss the pros and cons of survey experimental work with representative populations and incentivized respondents.
Note: This session of the DSCC is organised jointly with the Tinbergen Institute.
|13:30-14:15||Bettina Klaus (University of Lausanne) |
Random Matching under Priorities: Stability and No Envy Concepts
Abstract: We consider stability concepts for random matchings where agents have preferences over objects and objects have priorities for the agents. When matchings are deterministic, the standard stability concept also captures the fairness property of no (justified) envy. When matchings can be random, there are a number of natural stability / fairness concepts that coincide with stability / no envy whenever matchings are deterministic. We formalize known stability concepts for random matchings for a general setting that allows weak preferences and weak priorities, unacceptability, and an unequal number of agents and objects. We then present a clear taxonomy of the stability concepts and identify logical relations between them. Furthermore, we provide no envy / claims interpretations for some of the stability concepts that are based on a consumption process interpretation of random matchings. Finally, we present a transformation from the most general setting to the most restricted setting, and show how almost all our stability concepts are preserved by that transformation. (Based on joint work with Haris Aziz.)
|14:15-15:00||Flip Klijn (Institute for Economic Analysis, Barcelona) |
Approaching Mutually Best in Matching Markets: Rank-Fairness and Size of the Core
Abstract: This paper studies the one-to-one two-sided marriage model of Gale and Shapley (1962). If agents' preferences exhibit mutually best, there is a unique stable matching that is trivially rank-fair (i.e., in each matched pair the agents assign one another the same rank). We introduce two types of distances that measure to what extent a preference profile violates mutually best. We study whether the size of the core and the rank-unfairness of stable matchings can be bounded in terms of the distances to mutually best. Our findings are negative if we do not make additional assumptions on the domain of preference profiles. We obtain positive results on the domain of horizontal heterogeneity. (Based on joint work with Christopher Kah and Markus Walzl.)
|15:30-16:15||Kristof Bosmans (Maastricht University) |
Failure to Compensate or Failure to Reward? A Decomposition of Inequality of Opportunity
Abstract: We decompose inequality of opportunity into compensation and reward components. The former component measures the unfair inequality due to circumstances and the latter component measures the deviation from the fair inequality stemming from the exercise of responsibility. Our analysis illuminates the connection between the liberal and utilitarian approaches to inequality of opportunity measurement. (Based on joint work with Z. Emel Öztürk.)
|16:15-17:00||Ton Storcken (Maastricht University) |
An Axiomatic Characterization of Slater Rule and Kemeny Rule
Abstract: The Slater rule and the Kemeny rule are characterized by similar sets of conditions. Moreover, these characterizations are deduced along a similar line of proof. The characterizing conditions are neutrality, a monotonicity condition, either being tournamental or weighted tournamental, and either moderately swinging or slowly swinging. Being (weighted) tournamental relates a collective decision rule to the pairwise (weighted) majority relations. Moderately and slowly swinging demand that these rules change their outcomes gradually. We also address the independence of these conditions. (Based on joint work with Burak Can and Mohsen Pourpouneh.)
|17:15-18:00||Jordi Massó (Universitat Autònoma de Barcelona) |
On Strategy-Proofness and Semilattice Single-Peakedness
Abstract: We study social choice rules defined on the domain of semilattice single-peaked preferences. We characterize the class of strategy-proof rules that are also tops-only, anonymous and unanimous. These rules are deeply related to the supremum of the underlying semilattice structure. (Based on joint work with Agustín Bonifacio.)
Location: Doelenzaal, University Library, Singel 425, Amsterdam (directions)
Hosts: Ulle Endriss, Ronald de Haan (University of Amsterdam)
|13:30-14:00||Coffee will be available in the lobby.|
|14:00-14:40||Zoi Terzopoulou (University of Amsterdam) |
Aggregating Incomplete Judgments: Axiomatisations for Scoring Rules
Abstract: Judgment aggregation is a formal framework for collective decision making regarding logically interconnected issues. In this framework, the case in which the members of a group can choose to provide a judgment for only some of the issues at stake is significantly underexplored. A natural class of aggregation rules in this context are scoring rules for which an individual's weight in the decision making process is determined by the number of issues she provides a judgment for. We formulate several appealing axioms for aggregating incomplete judgments and show how they characterise specific rules within this class of scoring rules. The talk is based on joint work with Ulle Endriss and Ronald de Haan.
|15:00-15:40||Daniele Porello (Free University of Bozen-Bolzano) |
Social Mechanisms for the Collective Engineering of Ontologies
Abstract: Ontologies are complex devices for representing knowledge about a given domain, and they are increasingly used in a variety of applications. Building ontologies collaboratively has the advantage of allowing practitioners to share their expertise in the modelling of a domain. However, collaborative ontology engineering, seen as a form of knowledge integration, is prone to inconsistencies. We propose two techniques to deal with this situation. First, we study how to repair an inconsistent collective ontology that results from the views of heterogeneous experts, once they have been aggregated by means of voting. Second, we prevent the creation of any inconsistencies by letting the experts engage in a turn-based rational negotiation about the information to be added to the collective ontology.
|16:00-16:40||Baharak Rastegari (University of Bristol) |
Stable Matching with Uncertain Preferences
Abstract: We consider the two-sided stable matching setting in which there may be uncertainty about the agents' preferences due to limited information or communication. We consider four models of uncertainty: lottery model, compact indifference model, joint probability model, and pairwise probability model. For each of these models, we study the computational complexity of computing the stability probability of a given matching as well as finding a matching with the highest probability of being stable. We also examine more restricted problems such as deciding whether a certainly stable matching exists. We find a rich complexity landscape for these problems, indicating that the form uncertainty takes is significant.
|17:00-17:40||Nicolas Maudet (Pierre and Marie Curie University, Paris) |
Envy-Free Allocations on Graphs: An Overview of Recent Results
Abstract: Fairly allocating indivisible items to agents is a fundamental problem in computer science and economics. Envy-freeness, in particular, is a well-studied notion of fairness that requires that no agent would prefer the bundle held by another agent over their own bundle. Recently, several works have studied models where the assumption of full observability of agents (in the sense that they can see the bundle of any other agent) is relaxed. Indeed, many applications exhibit spatial or temporal constraints which make this assumption unrealistic. After a quick overview of these works, I will concentrate on one of the most basic instances of such settings, namely house allocation problems (i.e., each agent must receive exactly one item). We will more specifically discuss two problems: (i) for a given preference profile and on a given graph where agents are located, deciding whether there exists an envy-free allocation, and (ii) the variant where the designer can also decide on the location of agents. We will also report on experiments studying the likelihood of finding an envy-free allocation on specific graph topologies. This is joint work with Aurélie Beynier, Yann Chevaleyre, Laurent Gourvès, Julien Lesca, and Anaëlle Wilczynski.
Location: Room H9-02, Tinbergen Building, Campus Woudestein, Erasmus University Rotterdam (directions)
Hosts: Constanze Binder, Benoît Crutzen, Dana Sisak (Erasmus University Rotterdam)
|15:00-16:00||Sacha Kapoor (Erasmus University Rotterdam) |
The Cost of Political Representation
Abstract: We estimate the causal effect of independent candidates on voter turnout and election outcomes in India. To do this, we exploit exogenous changes in the entry deposit candidates pay for their participation in the political process, changes that disproportionately excluded candidates with no affiliation to established political parties. A one standard deviation increase in the number of independent candidates increases voter turnout by 5-6 percentage points, as some voters choose to vote rather than stay home. The vote share of independent candidates increases by 9-10 percentage points, as some existing voters switch who they vote for. Thus, independents allow winning candidates to win with a smaller vote share, decrease the probability of electing a candidate from the governing coalition by about 27-30 percentage points, and ultimately increase the probability of electing an ethnic-party candidate. Altogether, the results imply that the price of participation by independents is constituency representation in government.
|16:15-17:15||Stephane Wolton (London School of Economics) |
Wisdom of the Crowd? Information Aggregation in Representative Democracy
Abstract: In representative democracy, voters elect candidates who strategically propose policies. In a common value environment with imperfectly informed voters and candidates, we establish that intermediation by candidates can render information aggregation infeasible even when a large electorate presented with exogenous options would almost always select the correct policy. In fact, the possibility of information aggregation encourages candidates' conformism and stifles the competition among ideas. Neither liberalizing access to candidacy nor introducing additional frictions in voters' information guarantees feasible information aggregation. Thus, the political failure we uncover is due to the intermediation by candidates — that is, the nature of representative democracy.
|17:30-18:30||Giacomo Ponzetto (CREI, Universitat Pompeu Fabra) |
Fundamental Errors in the Voting Booth
Abstract: Psychologists have long documented that we over-attribute people's actions to innate characteristics, rather than to luck or circumstances. Similarly, economists have found that both politicians and businessmen are rewarded for luck. In this paper, we introduce this "Fundamental Attribution Error" into two benchmark political economy models. In both models, voter irrationality can improve politicians' behavior, because voters attribute good behavior to fixed attributes that merit reelection. This upside of irrationality is countered by suboptimal leader selection, including electing leaders who emphasize objectives that are beyond their control. The error has particularly adverse consequences for institutional choice, where it generates too little demand for a free press, too much demand for dictatorship, and responding to endemic corruption by electing new supposedly honest leaders, instead of investing in institutional reform.
Note: This session of the DSCC is organised jointly with the Tinbergen Institute.
Location: Room C2-1, Theil Building, Campus Woudestein, Erasmus University Rotterdam (directions)
Host: Constanze Binder (Erasmus University Rotterdam)
|14:00-15:00||Vladimir Karamychev (Erasmus University Rotterdam) |
Do Spectrum Auctions Get It Right? Auction Prices and Budget Constraints
Abstract: Auction theory is typically presented as the prime example of the success of game theory. However, when players' (bidders') preferences are slightly different from what the theory predominantly assumes, outcomes of auction games may differ quite remarkably from what the theory predicts. In this research review, which is a compilation of several research articles, we will see whether spectrum auctions `get it right' in the sense of whether they generate the market prices that the authorities are expecting (according to theory), and whether bidding under a budget constraint in spectrum auctions is just bidding under a bid cap.
|15:15-16:15||Rudolf Müller (Maastricht University) |
Characterizing Implementable Allocation Rules in Multi-dimensional Environments
Abstract: We study characterizations of implementable allocation rules when types are multi-dimensional, monetary transfers are allowed, and agents have quasi-linear preferences over outcomes and transfers. Every outcome is associated with a valuation function that maps an agent's type to his value for this outcome. The set of types is assumed to be convex. Our main characterization theorem shows that allocation rules are implementable if and only if they are implementable on any two-dimensional convex subset of the type set. For finite sets of outcomes and continuous valuation functions, they are implementable if and only if they are implementable on every one-dimensional subset of the type set. This extends a characterization result by Saks and Yu (EC-2005) from models with linear valuation functions to arbitrary continuous valuation functions, and provides a simple proof of their result. Modeling multi-dimensional mechanism design the way we propose it here is of relevance whenever types are given by few parameters, while the set of possible outcomes is large, and when values for outcomes are non-linear functions in types. This is joint work with André Berger and Seyed Hossein Naeemi.
|16:30-17:30||John Weymark (Vanderbilt University) |
Dominant Strategy Implementability and Zero Length Cycles
Abstract: Necessary conditions for dominant strategy implementability of an allocation function on a restricted type space are identified when utilities are quasilinear and the set of alternatives is finite. For any one-person mechanism obtained by fixing the other individuals' types, the geometry of the partition of the type space into subsets that are allocated the same alternative is used to identify situations in which it is necessary for all of the cycle lengths in the corresponding allocation graph to be zero. When all cycle lengths are zero, it is shown that the allocation function can be implemented by setting the payment for an alternative equal to the average length of the arcs that terminate at its node in the allocation graph. This is joint work with Paul H. Edelman.
|13:30-14:00||Coffee will be available in the lobby.|
|14:00-14:10||Opening of the Special Session|
|14:10-15:00||Salvador Barberà (Universitat Autònoma de Barcelona) |
Incentives and Domain Restrictions
Abstract: After briefly discussing some of the many research avenues that Kenneth Arrow opened up, we'll concentrate on the issue of domain restrictions and its implications for the study of incentives. Specifically, we'll study a notion of non-manipulability by groups, based on the idea that only some agreements among potential manipulators may be credible, and its implications for the design of voting rules operating in the domain of multi-dimensional single-peaked preferences. The derived notion of immunity to credible manipulations by groups is intermediate between individual and group strategy-proofness. Our main non-recursive definition turns out to be equivalent, in our context, to the requirement that truthful preference revelation should be a strong coalition-proof equilibrium, as recursively defined by Peleg and Sudhölter (Review of Economic Design, 1999). We provide characterizations of strategy-proof rules, separating those that satisfy it from those that do not, for a large family of public good decision problems. This is joint work with Dolors Berga and Bernardo Moreno.
|15:30-16:20||Herrade Igersheim (CNRS & Université de Strasbourg) |
The Death of Welfare Economics and the Emergence of Social Choice Theory: History of a Controversy
Abstract: The death of welfare economics has been declared several times. One of the reasons cited for these plural obituaries is that Kenneth Arrow's impossibility theorem, as set out in his path-breaking Social Choice and Individual Values in 1951, has shown that the social welfare function—one of the main concepts of the new welfare economics as defined by Abram Bergson (Burk) in 1938 and clarified by Paul Samuelson in the Foundations of Economic Analysis (1947)—does not exist under reasonable conditions. Indeed, from the very start, Arrow kept asserting that his famous impossibility result has direct and devastating consequences for the Bergson-Samuelson Social Welfare Function, though he seemed to soften his position in the early eighties. For his part, especially from the seventies on, Samuelson remained active on this issue and continued to defend the concept he had devised with Bergson, tooth and nail, against Arrow's attacks. In this talk, we will examine this rather strange controversy, which is almost unknown in the scientific community, even though it lasted more than fifty years and saw a conflict between two economic giants, Arrow and Samuelson, and behind them two distinct communities—the fading welfare economics against the emerging social choice theory—two conflicting ways of dealing with mathematical tools in welfare economics and, above all, two different conceptions of social welfare.
|16:50-17:40||Hans Peters (Maastricht University) |
On Condorcet Consistency and the Participation Paradox
Abstract: In this talk we consider two conditions on social choice correspondences. The first is Condorcet consistency: if there are alternatives that (weakly) beat every other alternative in pairwise comparison, then the correspondence should select exactly those alternatives. The second is that the correspondence should not exhibit the participation paradox, which in our case can be of two types: if an additional voter ranks a winning candidate at top, then that candidate may lose, and if an additional voter ranks a losing candidate at bottom, then that candidate may become winning. We characterize the maximal Condorcet consistent correspondence that does not exhibit the participation paradox, and investigate which single-valued selections (social choice functions) inheriting both conditions are possible. The talk is inspired by a recent paper of Felsenthal and Nurmi, presented about a year ago at the DSCC, and is based on joint work with Laura Kasper and Dries Vermeulen.
|13:00-13:50||Matthias Mnich (Maastricht University) |
Stable Marriage with Covering Constraints: A Complete Computational Trichotomy
Abstract: We consider stable marriage problems equipped with covering constraints: here the input distinguishes a subset of women, and we seek a matching with the fewest blocking pairs that matches all of the distinguished women. This generalizes the notion of arranged marriages introduced by Knuth in 1976, in which the partner of each distinguished person is fixed a priori. Our main result is a complete computational complexity trichotomy of the stable marriage problem with covering constraints, into polynomial-time solvable cases, fixed-parameter tractable cases, and cases that are W[1]-hard, for every choice among a set of natural parameters, namely the maximum length of preference lists for men and women, the number of blocking pairs allowed, and the number of distinguished women. Thereby, we fully answer an open problem of Hamada et al. (ESA-2011). This is joint work with Ildikó Schlotter.
|13:50-14:40||Hessel Oosterbeek (University of Amsterdam) |
The Performance of School Assignment Mechanisms in Practice
Abstract: Theory points to a potential trade-off between two main school assignment mechanisms: Boston and Deferred Acceptance (DA). While DA is strategy-proof and gives a stable matching, Boston might outperform DA in terms of ex-ante efficiency. We quantify the trade-offs between the mechanisms by using information about actual choices under (adaptive) Boston complemented with survey data eliciting students' school preferences. We find that under Boston around 8 percent of the students apply in the first round to another school than their most-preferred school. We compare allocations resulting from Boston with DA with single tie-breaking (one central lottery; DA-STB) and multiple tie-breaking (separate lottery per school; DA-MTB). DA-STB places more students in their top-n schools, for any n, than Boston. DA-STB and Boston place more students in their single most-preferred school than DA-MTB, but fewer in their top-n, for n > 1. In terms of ex-ante efficiency, a majority of students is better off under Boston than under DA, while average welfare is higher (equivalent to a reduction in the home-school distance by 10 percent) under DA-STB than under Boston. Finally, students from disadvantaged backgrounds benefit most from a switch from Boston to one of the DA mechanisms. This is joint work with Monique de Haan, Pieter Gautier, and Bas van der Klaauw.
|15:00-15:50||Dominik Karos (Maastricht University) |
Informational Cascades Revisited: The Human Factor of Protest Dynamics
Abstract: A model of coordinated behavior in protests is provided and two testable implications are derived. First, repressions have an ambiguous effect on protest outcomes unless they are very harsh; second, the probability that an anti-government protest turns into a successful revolution is higher under repressive than under democratic regimes. Both are true independently of the underlying social networks and the agents' preferences. The implications of the provided model are illustrated using data on protests, revolutions, and political terror worldwide between 1976 and 2014.
|15:50-16:40||Emre Ergin (Maastricht University) |
How to Choose a Delegation for a Peace Conference?
Abstract: This paper analyzes how to choose a delegation, a committee to represent a society such as in a peace conference, by proposing normative conditions. We seek optimal, consistent, neutral and non-manipulable ways to choose a delegation. We show that a class of threshold rules is characterized by these criteria. The rules impose that the combined support for the delegation should exceed a particular percentage of the public opinion depending on the size of the delegation. This is joint work with Burak Can and Péter Csóka.
Location: University Theatre (1.01A), Nieuwe Doelenstraat 16, Amsterdam (directions)
Host: Ulle Endriss (University of Amsterdam)
|14:15-15:00||Sirin Botan (University of Amsterdam) |
Propositionwise Updates in Opinion Diffusion
Abstract: We study the diffusion of opinions on a social network as an iterated process of aggregating neighboring opinions and updating opinions accordingly. Individual views are modeled as vectors of yes/no answers to a number of propositions connected by an integrity constraint. We propose a diffusion model based on individual propositionwise updates, where each agent changes her opinion on a single proposition at a time, that guarantees the satisfaction of the integrity constraint at every iteration step. Our focus will be a syntactic characterization of those integrity constraints that provide individuals with a maximal number of updates. We also provide an axiomatic analysis of the diffusion process, grounded in the literature on judgment aggregation. This is joint work with Umberto Grandi and Laurent Perrussel.
|15:00-15:45||Zoé Christoff (University of Liverpool) |
From Proxy Voting to DeGroot (and Back)
Abstract: This paper brings closer together the voting system known as "proxy voting", where each agent chooses either to express his own opinion or to delegate to another agent, and the "DeGroot model" of opinion diffusion in a social network, where each agent iteratively updates his opinion under the weighted influence of others. We first propose an analysis of proxy voting from a judgment aggregation perspective. In particular, we show how proxy voting can be embedded into the framework of binary judgment aggregation with abstention, allowing known results about the latter to apply to the former. We then turn to an analysis of "Boolean DeGroot processes", a limit case of the DeGroot stochastic model where each agent holds binary opinions and has a unique influencer. We establish the convergence conditions of such opinion dynamics, which in turn inform our discussion of proxy voting. This is joint work with Davide Grossi.
|16:15-17:00||Emily Tanimura (Centre d'Economie de la Sorbonne) |
Strategic Influence in Social Networks
Abstract: We consider a model of influence with a set of non-strategic agents and two strategic agents. The non-strategic agents have initial opinions and are linked through a network. They update their opinions as in the DeGroot model. The two strategic agents have fixed and opposed opinions. They each form a link with a non-strategic agent in order to influence the average opinion that emerges due to interactions in the network. This procedure defines a zero-sum game whose players are the two strategic agents and whose strategy set is the set of non-strategic agents. We focus on the existence and the characterization of pure strategy equilibria in this setting. The existence of a pure strategy equilibrium is shown to depend on the structure of the network. Our characterization of equilibrium emphasizes two features: on the one hand, the influenceability of target agents, and on the other hand, their centrality, whose characterization in our context induces a new notion that we call intermediacy. This is joint work with Michel Grabisch, Antoine Mandel, and Agnieszka Rusinowska.
|17:00-17:45|| René van den Brink (VU Amsterdam) |
Network Structures with Hierarchy and Communication
Abstract: Agents participating in different kinds of organizations usually take different positions in some relational structure. Two well-known relational structures are hierarchies and communication networks. The aim of this paper is to introduce a new type of network structure having both communication and hierarchical features. We describe a network by the feasible sets of network positions (nodes). We introduce and analyze accessible union stable network structures. Union stability means that the union of two non-disjoint feasible sets is also feasible. It reflects communication in the sense that positions belonging to both sets can act as intermediaries and make the union feasible. Accessibility means that every feasible set has at least one node such that without this node the set is still feasible. This property reveals an asymmetry between the node that can be deleted and the other nodes, and shows a hierarchical feature. This is joint work with Encarna Algaba and Chris Dietz.
|18:00||Drinks in a nearby café|
Location: Room C2-1, Theil Building, Campus Woudestein, Erasmus University Rotterdam (directions)
Host: Constanze Binder (Erasmus University Rotterdam)
|14:00-15:00||Bauke Visser (Erasmus University Rotterdam) |
Reputation Management and Assessment in the Lab
Abstract: In a 'reputation game', reputation-concerned agents use decisions and accompanying statements to influence assessments of their competence, and evaluators take such attempts into account when assessing them. We test the theoretical implications in the lab by comparing treatments with and without reputation concerns, and with and without statements. Reputation concerns make statements less informative about competence and, in turn, assessments of competence less responsive to decisions and statements. Evaluators overreact to infrequent statements and are generally too tough. Agents are less inclined to use decisions to influence assessments when they can also use statements.
|15:15-16:15||Thomas Boyer-Kassem (Tilburg University) |
No Need for a Secret Ballot? How to Reduce Reputational Cascades in Expert Committees
Abstract: People sometimes misrepresent their opinions because others have expressed opposite views and public disagreement comes with various costs. This falsification of preferences may also affect experts in committees, who may align with others' already expressed opinions. To assess the importance of reputational cascades, we propose a simple model of sequential deliberation. The model enables us to analyse the influence of various parameters and suggests three ways to reduce the effects of preference falsification: (i) allow experts to express fine-grained opinions instead of binary ones; (ii) have experts speak in specific orders; (iii) hold a sufficient number of table rounds. Thus, the effects of reputational concerns can be reduced, even in a sequential and non-secret voting procedure.
|16:30-17:30||Franz Dietrich (Paris School of Economics) |
Abstract: A group is often construed as a single agent with its own probabilistic beliefs, where these beliefs are obtained by aggregating those of the group members. In a celebrated contribution, Russell et al. (Philosophical Studies, 2015) apply the Bayesian paradigm to groups by requiring group beliefs to change via Bayes' rule whenever new information becomes publicly available, i.e., whenever the members' beliefs all change via Bayes' rule given that information. The Bayesian paradigm in fact suggests a stronger requirement: Bayes' rule should constrain group beliefs not just in the face of public information (learnt by all members), but also in the face of non-public information (learnt by only some members), including private information (learnt by just one member). I propose a taxonomy of types or degrees of group Bayesianism. Each type requires group beliefs to obey Bayes' rule for a certain type of information. The types of information (hence, of group Bayesianism) are obtained by differentiating according to (i) how widely information is accessible in the group, and (ii) whether or not information is representable within the algebra where credences are held. Six theorems will establish how exactly (and whether) group credences can obey any given type of group Bayesianism. As it turns out, group credences must be formed using 'weighted geometric averages' of individual credences, with certain constraints (on the weights) that depend on the kind of group Bayesianism. One of these theorems -- the one concerned with public and representable information -- is essentially Russell et al.'s central result (with some necessary minor qualifications).
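The 'weighted geometric average' pooling mentioned in the abstract can be checked numerically for the public, representable case: conditioning the geometric pool on an event E learnt by all members coincides with geometrically pooling the conditioned individual credences. The credences and weights below are made up for illustration:

```python
# Numerical check that geometric pooling commutes with Bayesian conditioning
# on a public, representable event E. Credences and weights are illustrative.
from math import prod

def normalize(p):
    s = sum(p.values())
    return {w: v / s for w, v in p.items()}

def geometric_pool(credences, weights):
    """Weighted geometric average of credence functions, renormalized."""
    worlds = credences[0].keys()
    return normalize({w: prod(c[w] ** a for c, a in zip(credences, weights))
                      for w in worlds})

def condition(p, event):
    """Bayesian conditioning: restrict to the event and renormalize."""
    return normalize({w: v for w, v in p.items() if w in event})

P1 = {"w1": 0.5, "w2": 0.3, "w3": 0.2}
P2 = {"w1": 0.1, "w2": 0.6, "w3": 0.3}
weights = [0.7, 0.3]
E = {"w1", "w2"}

pool_then_update = condition(geometric_pool([P1, P2], weights), E)
update_then_pool = geometric_pool([condition(P1, E), condition(P2, E)], weights)
# The two resulting credence functions agree (up to floating point).
```

The identity holds because conditioning each P_i on E divides it by the constant P_i(E), and constants cancel under renormalization of the geometric pool.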
Location: Seminar Room C-1.03, Tongersestraat 53, Maastricht (directions)
Host: Burak Can (Maastricht University)
|10:30-11:10||Battal Dogan (University of Lausanne) |
Object Allocation via Immediate Acceptance
|11:10-11:50||Jan Christoph Schlegel (University of Lausanne) |
Virtual Demand and Stable Mechanisms
|11:50-12:30||Utku Ünver (Boston College) |
Multi-Donor Organ Exchange
|12:30-14:30||Lunch at Ginger|
|14:30-15:10||Shashwat Khare (Maastricht University) |
Stability in Matching with Couples Having Non-Responsive Preferences
|15:10-15:50||Péter Biró (Hungarian Academy of Sciences) |
Integer Programming Methods for Special College Admissions Problems
|16:20-17:00||Bertan Turhan (Instituto Tecnológico Autónomo de México) |
Dynamic Reserves in Matching Markets
|17:00-17:40||Burak Can (Maastricht University) |
Comparing Matchings: Metrics on Matching Markets
|18:00-20:00||Dinner at Dadawan|
Location: Room C1-4, Theil Building, Campus Woudestein, Erasmus University Rotterdam (directions)
Host: Constanze Binder (Erasmus University Rotterdam)
|14:00-15:00||Harrie de Swart (Erasmus University Rotterdam) |
Majority Rule Does Not Respect Domination!
Abstract: I will present a recent paper by Michel Balinski and Rida Laraki, called Majority Measures. Almost everyone seems to be convinced that using the Majority Rule to choose one of two candidates is infallible, and much of the literature takes Condorcet consistency as a most desirable property. Donald Saari has already criticized the latter by pointing out that the Condorcet winner may change if a Condorcet portion is added to or subtracted from the given profile, while intuitively such a portion should not change the outcome. While in the traditional framework voters are supposed to express their opinion by giving a ranking of the candidates, Balinski and Laraki argue that it is natural and more informative to let voters express their opinion by evaluating each candidate on a common scale of grades. They show that Majority voting between two candidates fails when the winner is strongly rejected by the rest of the electorate whereas the loser is consensual. More precisely, in such cases the Majority Rule does not respect domination: the grades of the Majority loser may dominate the grades of the Majority winner, although candidate A should clearly defeat B when A's grades dominate B's. They show that a method of (social) ranking respects domination for three candidates or more when it satisfies May's axioms, but now with measures instead of comparisons as input, together with Transitivity and IIA. Any point-summing method satisfies these axioms and so does Majority Judgment, but Majority Rule does not. Only in polarized electorates is the Majority Rule incontestable, and there Majority Judgment agrees with it.
|15:15-16:15||Antoinette Baujard (University of Lyon) |
How Voters Use Grade Scales in Evaluative Voting
Abstract: During the first round of the 2012 French presidential election, participants in an in situ experiment were invited to vote under "evaluative voting", which involves rating the ten candidates using different numerical scales: (0,1), (-1,0,1), (0,1,2), and (0,1, ... , 20). The paper studies scale calibration effects, i.e., how individual voters adapt to the offered scale, leading to possibly different election outcomes. The data show that the scales are not linearly equivalent, even if individual ordinal preferences remain consistent. Scale matters, notably because of the symbolic power of negative grades, which do not affect all candidates uniformly. The paper concludes by discussing the optimal choice of the scale. Joint work with Frédéric Gavrel, Herrade Igersheim, Jean-François Laslier, and Isabelle Lebon.
|16:30-17:30||Markus Brill (University of Oxford) |
Justified Representation in Approval-based Committee Voting
Abstract: We consider approval-based committee voting, i.e., the setting where each voter approves a subset of candidates, and these votes are then used to select a fixed-size set of winners (committee). We propose a natural axiom for this setting, which we call justified representation (JR). This axiom requires that if a large enough group of voters exhibits agreement by supporting the same candidate, then at least one voter in this group has an approved candidate in the winning committee. We show that for every list of ballots it is possible to select a committee that provides JR. However, it turns out that several prominent approval-based voting rules may fail to output such a committee. In particular, while Proportional Approval Voting (PAV) always outputs a committee that provides JR, Reweighted Approval Voting (RAV), which can be seen as a tractable approximation to PAV, does not have this property. We then introduce a stronger version of the JR axiom, which we call extended justified representation (EJR), and show that PAV satisfies EJR, while other rules we consider do not; indeed, EJR can be used to characterize PAV within the class of weighted PAV rules. We also consider several other questions related to JR and EJR, including the relationship between JR/EJR and core stability, and the complexity of the associated algorithmic problems. Joint work with Haris Aziz, Vincent Conitzer, Edith Elkind, Rupert Freeman, and Toby Walsh.
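The JR axiom lends itself to a direct check. A minimal sketch, assuming ballots are given as sets of approved candidates, with the group-size quota n/k taken from the definition in the abstract:

```python
# Sketch: check whether a committee provides justified representation (JR).
# Ballots are sets of approved candidates; k is the committee size.

def provides_jr(ballots, committee, k):
    """JR fails iff some group of >= n/k voters all approve a common
    candidate while none of them approves any member of the committee."""
    n = len(ballots)
    # Voters with no approved candidate in the committee.
    unrepresented = [b for b in ballots if not (b & committee)]
    candidates = set().union(*ballots)
    for c in candidates:
        if sum(1 for b in unrepresented if c in b) >= n / k:
            return False  # a cohesive, fully unrepresented group exists
    return True

ballots = [{"a"}, {"a"}, {"b"}, {"c"}]
print(provides_jr(ballots, {"b", "c"}, 2))  # the two a-voters are ignored
print(provides_jr(ballots, {"a", "b"}, 2))  # the a-group is represented
```

With n = 4 and k = 2, the quota is 2: the committee {b, c} leaves the two voters who agree on a entirely unrepresented and so fails JR, while {a, b} provides it.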
|14:00-14:45||Hannu Nurmi (University of Turku) |
Two Types of Participation Failure under Nine Voting Methods in Variable Electorates
Abstract: We study the susceptibility of nine voting procedures to two types of no-show paradoxes, denoted P-TOP and P-BOT. In the former it is possible that candidate x is elected by a given electorate and then, ceteris paribus, another candidate, y, is elected once additional voters ranking x at the top of their preference rankings join in. The P-BOT paradox refers to the possibility that candidate y, who has not been elected by a given electorate, is elected if additional voters join the electorate who, ceteris paribus, rank y at the bottom of their rankings. This is joint work with Dan Felsenthal.
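The two paradox types can be formalized as checks on an arbitrary resolute voting rule. The plurality rule below is only a stand-in to make the sketch runnable; the talk's results concern nine specific procedures not reproduced here:

```python
# Sketch: checks for the P-TOP and P-BOT no-show paradoxes. A profile is a
# list of rankings (tuples, best candidate first); `rule` maps a profile to
# a single winner. Plurality with alphabetical tie-breaking is illustrative.

def plurality(profile):
    tally = {}
    for ranking in profile:
        tally[ranking[0]] = tally.get(ranking[0], 0) + 1
    # Highest tally wins; ties broken alphabetically, purely for determinism.
    return min(tally, key=lambda c: (-tally[c], c))

def p_top_paradox(rule, profile, extra_voters):
    """x wins; voters all ranking x first join; someone else wins."""
    x = rule(profile)
    if any(r[0] != x for r in extra_voters):
        return False  # the added voters must all rank the winner on top
    return rule(profile + extra_voters) != x

def p_bot_paradox(rule, profile, extra_voters, y):
    """y loses; voters all ranking y last join; y wins."""
    if rule(profile) == y or any(r[-1] != y for r in extra_voters):
        return False
    return rule(profile + extra_voters) == y

profile = [("a", "b"), ("a", "b"), ("b", "a")]
print(p_top_paradox(plurality, profile, [("a", "b")]))  # plurality: no paradox
```

Plurality itself never exhibits either paradox (added x-top voters only raise x's tally, and added y-bottom voters never raise y's), which is why the interesting cases in the talk involve other rules.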
|14:45-15:30||Ali Ihsan Ozkes (Aix-Marseille School of Economics) |
The Condorcet Jury Theorem under Cognitive Hierarchies: Theory and Experiments
Abstract: This paper introduces an endogenous cognitive hierarchy model in which players are assumed to best-reply while holding heterogeneous beliefs about the other players' cognitive levels. Contrary to previous models, players are allowed to consider the presence of opponents at their own cognitive level. This extension is shown to eliminate the incompatibility of standard cognitive hierarchy models with games where the best-reply function is an expansion mapping. We employ the model to explain asymptotic voting behavior in the information aggregation problems of the Condorcet Jury Theorem. The behavioral assumptions about strategic thinking turn out to be a crucial factor in whether asymptotic efficiency is obtained. We conducted laboratory experiments and obtained evidence that the endogenous cognitive hierarchy model provides significant improvements upon standard cognitive hierarchy models and the symmetric Bayesian Nash equilibrium in explaining the observed behavior of voters.
|16:00-16:45||Daniele Porello (CNR Trento) |
Reason-based Preferences and Deliberation
Abstract: Reason-based views of preferences and choices have recently been proposed in social choice theory, argumentation theory, and multiagent systems. After surveying a number of proposals for modelling reason-based preferences and discussing their compatibility with the tenets of deliberation, I introduce a cognitively inspired methodology (conceptual spaces) to represent reason-based preferences. This methodology can cope with partial information as well as with disagreement among agents about the salient reasons. I discuss the case of individual preferences and then turn to collectively sharable reasons for social preferences.
|17:00||Drinks in a nearby café|
|13:00-13:45||Abhinaba Lahiri (Maastricht University) |
Locating Two Public Bads in an Interval
Abstract: This paper studies the problem of locating two noxious facilities (garbage dumping sites for example) for agents with single-dipped preferences. A decision rule selects two points on the segment [0,1] for the public bads for every profile of reported preferences.
|13:45-14:30||Swarnendu Chatterjee (Maastricht University) |
Locating a Public Good on a Sphere
Abstract: It is shown that in a model where agents have single-peaked preferences on the sphere, every Pareto optimal social choice function that is strictly or coalitionally strategy-proof is dictatorial.
|15:00-16:00||Ana Mauleon (Université Saint-Louis, Brussels) |
Constitutions and Social Networks
Abstract: The objective of the paper is to analyze the formation of social networks where individuals are allowed to engage in several groups at the same time. These group structures are interpreted here as social networks. Each group is supposed to have specific rules or constitutions governing which members may join or leave it. Given these constitutions, we consider a social network to be stable if no group is modified any more. We provide requirements on constitutions and players' preferences under which stable social networks are guaranteed to exist. Furthermore, by embedding many-to-many matchings into our setting, we apply our model to job markets with labor unions. We find a variation of Roth's "polarization of interests" (cf. Roth, 1984) between employers and employees.
|16:00-17:00||Nick Baigent (London School of Economics) |
Understanding Sen's Independent Decisiveness
Abstract: This paper offers an exposition of Sen's Independent Decisiveness property. It is widely thought to be one of the most difficult properties in social choice theory. Yet the Arrow-type impossibility theorem in which it is the key property may be one of the most important in the whole of social choice theory, because it imposes no properties on group choices. In particular, no so-called "consistency properties" are required. The focus is both on a formal understanding of Independent Decisiveness and on possible justifications.
|17:30-19:00||Cocktail Ad Fundum|
Note: The talk by Nick Baigent is organised as a joint session with the GSBE-ETBC Seminar.