Cloud or On-Premise? A Strategic View of Large Language Model Deployment
Starting Page
1010
Abstract
Large language models (LLMs) have advanced rapidly in recent years. We examine a critical decision faced by an LLM provider: whether to offer a local (on-premise) service channel in addition to cloud services. We develop a game-theoretic queueing model to analyze the economic and welfare implications of introducing an on-premise option. Our results show that offering the localization option can reduce the provider's optimal profit due to market cannibalization, yet increase users' overall surplus. These market outcomes can be reinforced by users' privacy concerns, but may reverse when users differ significantly in their service valuations, as localization enables the provider to extract users' surplus more effectively. When localization is offered through a third party, price discrimination can further increase surplus extraction; however, double marginalization along the AI supply chain may offset these gains. Finally, in competitive markets, localization may prompt an entrant to lower the quality of its cloud services to limit cannibalization, thereby softening price competition with the incumbent to some extent. Overall, our analysis highlights the strategic trade-offs in LLM deployment and provides guidance on pricing and localization decisions.
Extent
10 pages
Type
Conference Paper
Related To
Proceedings of the 59th Hawaii International Conference on System Sciences
Rights
Attribution-NonCommercial-NoDerivatives 4.0 International
