I'm trying to write, in a single "file.cfg", the values of two variables generated by two independent scripts. The two variables are constantly updated and saved in "file.cfg". Below is an example of my setup.
example "file.cfg" content:
a=null
b=null
example "script_a.sh", which updates the "a" value:
#!/bin/bash
while :; do
    # .............
    val_a=1
    sed -i "s/^\(a=\).*/\1$val_a/" file.cfg
    # .............
done
example "script_b.sh", which updates the "b" value:
#!/bin/bash
while :; do
    # .............
    val_b=2
    sed -i "s/^\(b=\).*/\1$val_b/" file.cfg
    # .............
done
Each script works perfectly on its own and the values are updated. But when the two scripts run simultaneously, one of the two updates is sometimes lost.
I discovered that sed with the "-i" option writes to a temporary file and then replaces "file.cfg" with it, so two simultaneous edits can overwrite each other. How can I solve this?
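One common fix (not from the original post) is to serialize the two writers with flock(1), so only one sed -i runs at a time. Below is a minimal sketch of a single locked update; the lock file name file.cfg.lock and the setup line that seeds file.cfg are assumptions for the demo. Inside the original `while :` loops, only the locked region would change, and both scripts must use the same lock file.

```shell
#!/bin/bash
# A minimal sketch (assumed lock file name: file.cfg.lock): wrap the
# sed call in flock(1) so concurrent writers take turns instead of
# clobbering each other's temporary files.
[ -f file.cfg ] || printf 'a=null\nb=null\n' > file.cfg  # demo setup only

val_a=1
(
    flock -x 9                                # block until the exclusive lock is free
    sed -i "s/^\(a=\).*/\1$val_a/" file.cfg   # safe: only one writer at a time
) 9>file.cfg.lock                             # fd 9 holds the lock for the subshell
```

The subshell-plus-file-descriptor form releases the lock automatically when the subshell exits, so there is no stale lock to clean up if a script crashes mid-update.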
Actually – Xen2050 – 2019-02-23T03:52:26.673

lockfile-create will wait if the file's already locked; I'll edit in its -r option info... so the while loop's not even necessary if you're happy with its default delay. It's described as "guaranteed compatible with Debian's file locking policies", so I think it should perform equivalent to flock... @KamilMaciorowski The path/inode sed problem is interesting, but assuming both processes create a lockfile before starting sed, and clear the lockfile when sed's finished and the filename's "stable", I don't think there can be a conflict, since it creates an actual file.cfg.lock, so it's similar to your answer using some_lockfile that needs cleaning (lockfile-remove here) afterwards. – Xen2050 – 2019-02-23T03:54:07.297

Those sound valid, I'll edit in a bit about "holding" the lock for longer, and an "exit on fail" for the code example – Xen2050 – 2019-02-23T10:28:24.027