Mobile devices, together with their users, are constantly moving from one situation to another. To adapt applications to these changing contexts, the devices must be able to recognize them. Context information comes from various sources: sensors, tags, and positioning systems, to name a few. The raw signals from these sources are translated into higher-level interpretations of the situation; unfortunately, such data are often unreliable and constantly changing. We seek to improve the reliability of context recognition through an analogy to human behavior: when multiple devices are nearby, they can jointly negotiate a suitable shared context and behave accordingly. This approach is becoming particularly attractive given the multitude of personal devices on the market. We present a collaborative context determination scheme, suggest examples of potential applications of such collaborative behavior, and raise issues of context recognition, context communication, and network requirements.
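The abstract does not specify the negotiation mechanism itself. As one minimal illustrative sketch (not the paper's actual scheme), assume each nearby device broadcasts its locally recognized context as a hypothetical (label, confidence) pair; the group could then settle on a shared context by confidence-weighted voting:

```python
from collections import Counter

def fuse_contexts(local_estimates):
    """Confidence-weighted vote over context labels from nearby devices.

    local_estimates: list of (context_label, confidence) pairs, one per device.
    Returns the label with the highest total confidence; ties are broken by
    the number of devices supporting the label.
    """
    scores = Counter()   # summed confidence per label
    counts = Counter()   # number of devices reporting each label
    for label, confidence in local_estimates:
        scores[label] += confidence
        counts[label] += 1
    return max(scores, key=lambda label: (scores[label], counts[label]))

# Three devices agree on "meeting"; one noisy sensor reports "outdoors".
estimates = [("meeting", 0.8), ("meeting", 0.6), ("outdoors", 0.9), ("meeting", 0.7)]
print(fuse_contexts(estimates))  # meeting
```

Even this naive scheme shows how collaboration can override a single unreliable sensor: the outlier's high individual confidence is outweighed by the aggregate evidence of the other devices.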