Question:
Concurrency in reading and writing files?
ham55464
2011-09-21 01:38:06 UTC
I have a website, and part of it needs to read/write a plain text file on the server. Most of the time we have between 50 and 100 concurrent online users, and their actions on the site result in reads/writes to a certain file that serves as a temporary placeholder for some other pages.

So I want to know about concurrent read/write conflicts and what the best practice is to avoid them, or to handle them if they happen. How does my server manage these requests?
Three answers:
2011-09-21 01:52:43 UTC
A frequent approach is to use "two-phase locking", or "2PL". It works a bit like this:

(NB this is simplified to give you the idea of what's occurring)

User A reads record 1

User B reads record 1

User B attempts an edit so the record is locked for editing by other users

User A attempts an edit and waits

User B finishes editing; record 1 is now record 1a

Record 1a is released and the system causes User A to re-read the record, so the user sees record 1a

If User A still wants to edit, the record is locked while User A edits it.



E.g. in a seat-booking system, User A may find that the seat they wanted has been taken by User B, and so no longer wants to edit the record.
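
A minimal sketch of the "lock while editing" behaviour described above, applied to the plain text file the question asks about. The file name and helper function are made up for illustration, fcntl is Unix-only, and the answer itself doesn't prescribe any particular implementation:

import fcntl

def update_file(path, edit):
    # edit: a function that takes the current text and returns new text,
    # or None if the caller (like User A above) decides not to edit after all.
    with open(path, "r+") as f:
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)   # block until no other writer holds the lock
        try:
            current = f.read()        # re-read under the lock, so we see any concurrent edit
            new_text = edit(current)
            if new_text is not None:
                f.seek(0)
                f.truncate()
                f.write(new_text)
        finally:
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)   # release so the next writer can proceed

# Usage: append a line, keeping whatever another user wrote in the meantime
# (assumes state.txt already exists).
update_file("state.txt", lambda current: current + "\nnew line")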
keniston
2016-11-09 14:56:59 UTC
Flat file storage has these risks. 1: It can be corrupted, especially if somebody opens it in Notepad and modifies it. 2: Searching the data is very slow. 3: If your application is running and your machine unexpectedly restarts, your data can be corrupted. And other risks.
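
One common mitigation for risk 3 (corruption when the machine restarts mid-write), sketched here as an assumption rather than something this answer proposes: write the new contents to a temporary file and atomically rename it over the old one, so a crash leaves either the complete old file or the complete new file, never a half-written mix. The file name is invented for illustration:

import os
import tempfile

def atomic_write(path, text):
    directory = os.path.dirname(os.path.abspath(path))
    fd, tmp_path = tempfile.mkstemp(dir=directory)   # temp file on the same filesystem
    try:
        with os.fdopen(fd, "w") as tmp:
            tmp.write(text)
            tmp.flush()
            os.fsync(tmp.fileno())    # make sure the bytes reach the disk before renaming
        os.replace(tmp_path, path)    # atomic rename on both POSIX and Windows
    except BaseException:
        os.unlink(tmp_path)
        raise

atomic_write("state.txt", "replacement contents")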
csicky
2011-09-21 01:52:59 UTC
You did not mention what server environment you are using (PHP, ASP.NET, etc.).



Anyway, you should consider moving from plain text to a database.
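
A minimal sketch of that suggestion, assuming the plain text file is replaced with a SQLite database (the file name, table, and key are invented for illustration). SQLite is still just a file on the server, but it handles locking and crash recovery itself, so 50-100 concurrent readers and writers won't corrupt the data:

import sqlite3

conn = sqlite3.connect("placeholder.db", timeout=10)   # wait up to 10 s if another writer holds the lock
conn.execute("CREATE TABLE IF NOT EXISTS placeholder (key TEXT PRIMARY KEY, value TEXT)")

# Writer: the transaction is serialised against other writers automatically.
with conn:
    conn.execute("INSERT OR REPLACE INTO placeholder (key, value) VALUES (?, ?)",
                 ("page_42", "temporary contents"))

# Reader:
row = conn.execute("SELECT value FROM placeholder WHERE key = ?", ("page_42",)).fetchone()
print(row[0] if row else None)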


This content was originally posted on Y! Answers, a Q&A website that shut down in 2021.