<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html>
<head>
<meta content="text/html;charset=ISO-8859-1" http-equiv="Content-Type">
</head>
<body bgcolor="#ffffff" text="#000000">
<pre wrap="">"sleep for a couple of seconds"
But then you could only insert 1800 cdrs per hour...
If I were to insert 36000 cdrs per hour, this means that I have to
open
parse
close
10 files per second. Imagine the I/O penalty just for opening and closing the files.
(The parsing is the same in both situations.)
</pre>
<br>
<br>
David Knell wrote:
<blockquote cite="mid:49094B5E.3070203@3c.co.uk" type="cite">
<pre wrap=""><a class="moz-txt-link-abbreviated" href="mailto:regs@kinetix.gr">regs@kinetix.gr</a> wrote:
</pre>
<blockquote type="cite">
<pre wrap="">Yes, the xml files give you tons of info... but isn't it a little
insufficient - performance wise -
to open and close so many files in such a little time. In a PBX
environment that wouldn't be an
issue but if we get to the small-voip-carrier level (some thousand cdrs
per hour)
that could slow things down considerably, wouldn't it?
</pre>
</blockquote>
<pre wrap=""><!---->Not that you'd notice. We run XML CDR to database scripting on each box
that we use
for switching, and it's a pretty trivial task compared with switching
all that media. Doing it
this way is:-
(a) distributed - one process per box scales nicely;
(b) robust - script down, DB down, no problem: files just queue up;
(c) simple - the script logic is trivial:
- while 1
- for each file in the XML CDR directory
- open it
- parse it (XML::Simple for us)
- insert it into the DB
- delete it
- sleep for a couple of seconds
Two error cases: if a file can't be parsed, or data which should be there
can't be found, move it into another directory to be examined by real
eyes; if the DB insert fails, break out of the inner loop and it'll be
retried after a short pause.
--Dave
</pre>
</blockquote>
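<br>
<pre wrap="">For illustration, here is a minimal Perl sketch of the loop David
describes. The paths, DSN, table layout and the assumption that the
parsed CDR carries a top-level "variables" hash are placeholders, not
details taken from this thread; only the open / parse / insert / delete
flow and the two error cases follow the description above.

#!/usr/bin/perl
use strict;
use warnings;
use XML::Simple qw(XMLin);
use DBI;
use File::Copy qw(move);

# Placeholder locations and credentials - adjust to taste
my $cdr_dir   = '/var/spool/xml_cdr';
my $error_dir = '/var/spool/xml_cdr/bad';
my $dbh = DBI->connect('dbi:Pg:dbname=cdr', 'cdr', 'secret',
                       { RaiseError => 0, PrintError => 1 })
    or die "DB connect failed: $DBI::errstr\n";

while (1) {
    FILE: for my $file (glob "$cdr_dir/*.xml") {
        # open + parse; XMLin() dies on malformed XML, so trap it with eval
        my $cdr = eval { XMLin($file) };

        # Error case 1: can't parse, or the data we expect isn't there -
        # park the file in another directory for real eyes to examine
        if (!$cdr or !$cdr->{variables}) {
            move($file, $error_dir);
            next FILE;
        }

        # Insert into the DB; column names are purely illustrative
        my $ok = $dbh->do(
            'INSERT INTO cdrs (uuid, caller, callee, duration)
             VALUES (?, ?, ?, ?)',
            undef,
            @{ $cdr->{variables} }{qw(uuid caller_id_number
                                      destination_number duration)},
        );

        # Error case 2: DB insert failed - leave the file in place, break
        # out of the inner loop; it will be retried on the next pass
        last FILE unless $ok;

        unlink $file;    # done with this CDR
    }
    sleep 2;             # a couple of seconds between passes
}

Keeping the DB failure case non-fatal means a database or script outage
simply lets files queue up on disk, which is the robustness point (b)
above.
</pre>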
<br>
</body>
</html>