Practical Concerns for Automated Mediation

By MyeongHwan Cha

Although traditional mediations involve a human mediator, automated mediation not only provides a platform for virtual mediations and enhances the communication of information, but also actively facilitates dispute resolution through the use of artificially intelligent (“AI”) systems.[1]  Such systems are capable of identifying interests and goals, refining preferences, calculating tradeoffs, and creating potential solution packages.[2]  However, aside from the ethical concerns surrounding automated mediation, there are various practical issues that require attention.  Automated mediation systems like Smartsettle raise such issues, specifically concerns about creativity, care, and confusion.

Smartsettle is a negotiation support system that “elicits and manages preferences for any number of parties . . . and generates potential agreements based on party preferences.”[3]  The platform allows parties to privately view established issues between them and enables them to assign preferences for each.[4]  These preferences can be assigned through a visual graph that corresponds with the parties’ levels of satisfaction.[5]  Parties can then create and exchange their own resolution packages, which essentially compile a bundle of proposals for all of the issues involved.[6]  Each resolution package can reflect party preference through labels such as unacceptable, fair, conciliatory, or optimistic.[7]  As proposals are exchanged, the Smartsettle program can also generate and suggest alternative resolution packages based on those created by the parties.[8]  This visual blind bidding system generates proposals within the “zone of agreement” and eliminates the tedious back and forth that is often a part of mediations and negotiations.[9]  Visual blind bidding also allows participants or the mediator to make anonymous proposals disguised as suggestions from the Smartsettle algorithm.[10]  This is the essence of the system, but it is not all that Smartsettle has to offer.[11]
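Smartsettle’s actual algorithms are proprietary, so the following is only a minimal conceptual sketch of the blind bidding mechanism described above: each party privately scores the options for each issue on a 0–100 satisfaction scale, and a package falls within a notional “zone of agreement” when it meets both parties’ acceptance thresholds.  The issue names, scores, and thresholds here are hypothetical, and the ranking rule is merely a rough stand-in for the “maximize the minimum gain” feature mentioned in note 11.

```python
# Illustrative sketch only; this is NOT Smartsettle's proprietary algorithm.
# Assumptions: two parties, each issue has a small set of candidate options,
# and each party privately scores every option on a 0-100 satisfaction scale.
from itertools import product

# Hypothetical issues and candidate options.
ISSUES = {
    "payment": [40_000, 50_000, 60_000],
    "timeline_months": [3, 6, 12],
}

# Each party's private satisfaction score (0-100) for every option.
PREFERENCES = {
    "claimant": {
        ("payment", 40_000): 30, ("payment", 50_000): 60, ("payment", 60_000): 90,
        ("timeline_months", 3): 90, ("timeline_months", 6): 60, ("timeline_months", 12): 20,
    },
    "respondent": {
        ("payment", 40_000): 90, ("payment", 50_000): 60, ("payment", 60_000): 25,
        ("timeline_months", 3): 30, ("timeline_months", 6): 60, ("timeline_months", 12): 85,
    },
}

ACCEPTANCE_THRESHOLD = 55  # minimum average satisfaction a party will accept


def satisfaction(party, package):
    """Average of the party's private scores across all issues in the package."""
    return sum(PREFERENCES[party][(issue, choice)] for issue, choice in package.items()) / len(package)


def packages_in_zone_of_agreement():
    """Enumerate every combination of options and keep those acceptable to both sides."""
    zone = []
    for combo in product(*ISSUES.values()):
        package = dict(zip(ISSUES.keys(), combo))
        scores = {party: satisfaction(party, package) for party in PREFERENCES}
        if all(score >= ACCEPTANCE_THRESHOLD for score in scores.values()):
            # Rank acceptable packages by the lower of the two scores, a rough
            # stand-in for the "maximize the minimum gain" idea in note 11.
            zone.append((min(scores.values()), package, scores))
    return sorted(zone, key=lambda entry: entry[0], reverse=True)


if __name__ == "__main__":
    for _, package, scores in packages_in_zone_of_agreement():
        print(f"{package}  claimant={scores['claimant']:.0f}  respondent={scores['respondent']:.0f}")
```

A real system would, of course, elicit and normalize preferences far more carefully and layer on the fairness-enhancing features described in note 11; the sketch only shows why privately scored preferences let software propose packages neither side would volunteer openly.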

The main benefits of automated mediation are cost savings and convenience, especially in comparison to litigation.[12]  Other benefits are the potential for greater access to justice[13] and, possibly, the ability to reach better results.[14]  However, concerns about the use of automated mediation seem to undercut these proposed benefits.

First, there is an issue of creativity.  Although participants may feel more satisfied with automated mediation programs,[15] these platforms are restricted in the types of issues that can be addressed and the solutions that can be reached.[16]  While systems like Smartsettle may be intuitive for monetary issues, or issues that can be assigned some form of numerical value, more emotional or moral disputes are difficult to quantify.[17]  Moreover, participants may not be seeking a numerical or quantitative outcome at all.  A significant benefit of the traditional mediation process is that it allows parties to reach creative or unconventional solutions that may be unavailable in a court.[18]  Resolutions facilitated by a human mediator can involve apologies, future approaches to communication, an accounting for past wrongs, and agreements to refrain from certain contact or conduct.[19]

Second, there is an issue of human care.  A key part of mediation is providing an environment where participants can feel heard and understood.[20]  However, an automated mediation system cannot truly listen, nor can it consider personalities or physical and verbal cues.[21]  Participants may even feel isolated and believe that they are not worthy of human attention.[22]

Finally, there is an issue of confusion, specifically regarding accessibility and AI literacy.  People may simply lack access to technology or the Internet.[23]  Additionally, the software may not be available in certain languages, which excludes some groups from participating, or there may be risks of miscommunication between participants of different languages and cultures.[24]  As to AI literacy, participants may confront an intimidating platform with a learning curve unlike anything in their everyday use of the Internet and other technologies.[25]  This may create power imbalances, with those more comfortable with the platform at an advantage.[26]

However, as more complex algorithms become possible and as future generations rely further on technology, there is hope that automated mediation may play a bigger role in our society.  Researchers are now working to program algorithms that address the above-mentioned concerns,[27] and robots are increasingly taking on a variety of complex roles in our rapidly changing world.[28]  Nevertheless, the further development and future use of artificial intelligence in mediation must be reviewed carefully if it is to deliver the full benefits that mediation can offer.
____________________________

[1] Ayelet Sela, Can Computers Be Fair? How Automated and Human-Powered Online Dispute Resolution Affect Procedural Justice in Mediation and Arbitration, 33 Ohio State J. Disp. Resol. 91, 100 (2018).

[2] Id.

[3] About, Smartsettle, https://www.smartsettle.com/about-us [https://perma.cc/8HSN-693W] (last visited Jan. 28, 2022).

[4] Ayelet Sela, The Effect of Online Technologies on Dispute Resolution System Design: Antecedents, Current Trends, and Future Directions, 21 Lewis & Clark L. Rev. 635, 663 (2017); Davide Carneiro et al., Online Dispute Resolution: An Artificial Intelligence Perspective, 41 A.I. Rev. 211, 228 (2014).

[5] Sela, supra note 4.

[6] Id.

[7] ENegotiation by Smartsettle, Smartsettle Infinity Made Simple, YouTube (Oct. 31, 2018), https://www.youtube.com/watch?v=RroY5fqAevs [https://perma.cc/B298-LJEX].

[8] Sela, supra note 4; see also Carneiro, supra note 4.

[9] Ernest Thiessen & Graham Ross, Live Demonstration of a Working Collaborative eNegotiation System (Smartsettle Infinity), 18 Int’l Conf. A.I. & L. 50, 275 (2021).

[10] Id.

[11] See id. at 276 (describing an automatic deal-closer that closes small gaps to avoid impasse, maximizes the minimum gain, uncovers hidden value, generates improvements that distribute that value fairly, and applies fairness-enhancing normalization that distributes additional benefits fairly); see also David Allen Larson, “Brother, Can You Spare a Dime?” Technology Can Reduce Dispute Resolution Costs When Times Are Tough and Improve Outcomes, 11 Nev. L.J. 523, 539–40 (2011) (stating that an online chat function and an arbitration option are available, and that Smartsettle rewards the party that makes the smallest final move while also recognizing the more generous party).

[12] Joseph W. Goodman, The Pros and Cons of Online Dispute Resolution: An Assessment of Cyber-Mediation Websites, 2 Duke L. & Tech. Rev. 1, 7–8 (2003); Larson, supra note 11, at 541.

[13] See Anjanette H. Raymond & Scott J. Shackelford, Technology, Ethics, and Access to Justice: Should an Algorithm be Deciding Your Case?, 35 Mich. J. Int’l L. 485, 491–92 (2014) (observing that ODR platforms can mitigate backlog and potentially motivate consumers to seek redress, thereby increasing access to justice).

[14] See Sela, supra note 1, at 132–33 (finding that participants in principal, or automated, mediations reported greater satisfaction than participants in instrumental mediations, or mediations with a human mediator).

[15] Id.

[16] See Raymond & Shackelford, supra note 13, at 517 (stating that the use of algorithms for cases such as child custody and discrimination is a step too far for many individuals).

[17] See Goodman, supra note 12, at 10 (stating that automated mediation programs can only handle disputes where the amount of the settlement is the issue).

[18] Carrie J. Menkel-Meadow et al., Mediation Practice, Policy & Ethics 66–67 (3d ed. 2020).

[19] Id.

[20] Goodman, supra note 12, at 10–11; see Menkel-Meadow et al., supra note 18, at 163 (stating that a key task of a mediator is to enable communication and expression and to encourage parties to listen to each other).

[21] Goodman, supra note 12, at 11.

[22] Larson, supra note 11, at 556.

[23] Goodman, supra note 12, at 12.

[24] Larson, supra note 11, at 547.

[25] Id. at 542.

[26] Id. at 545.

[27] Id. at 550–51 (stating that computer scientists are investigating the use of interactive relation agents that can engage users in a relationship building dialogue and that researchers are also exploring the possibilities of psychological understanding in a robotic agent).

[28] Id. at 551–54 (observing that robots are being utilized in classrooms and health care facilities).

MyeongHwan Cha

The author is a 2L student at Cardozo School of Law and serves as a Staff Editor for Volume 23 of the Cardozo Journal of Conflict Resolution.
