Storage of many log files


I have a system that is receiving log files from different places over HTTP (>10k producers, 10 logs per day each, ~100 lines of text per log).

I would like to store them in a way that lets me run computations over them later.

My question is, what is the best way to store them?

  • Flat text files (with proper locking), one file per uploaded log, one directory per day/producer (see the sketch after this list)
  • Flat text files, one (large) file per day for all producers (the problem here will be indexing and locking)
  • A database table with one text field per log (MySQL is preferred for internal reasons) (the problem is purging the DB, as deletes can take very long!)
  • A database table with one record per line of text
  • A database with sharding (one table per day), allowing a simple data purge (this is partitioning; however, the version of MySQL available to me, i.e. the one supported internally, does not support it)
  • A document-based DB (CouchDB or MongoDB) (the problem could be with indexing / maturity / speed)
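
As a sketch of what the first option could look like on disk, here is a minimal Python write path; `store_log`, the base directory, and the file-naming scheme are all hypothetical:

    import os
    from datetime import date

    BASE_DIR = "/var/logs/incoming"  # hypothetical storage root

    def store_log(producer_id: str, upload_id: str, body: str) -> str:
        """Store one uploaded log as its own file: <base>/<day>/<producer>/<upload>.log."""
        day_dir = os.path.join(BASE_DIR, date.today().isoformat(), producer_id)
        os.makedirs(day_dir, exist_ok=True)   # safe under concurrent uploads
        path = os.path.join(day_dir, upload_id + ".log")
        tmp = path + ".tmp"
        with open(tmp, "w", encoding="utf-8") as f:
            f.write(body)                     # write under a temporary name first ...
        os.replace(tmp, path)                 # ... then rename atomically (POSIX)
        return path

With one file per upload there is no shared file to lock; writing to a temporary name and renaming means readers never see a partially written log.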

Any advice?

I would pick the very first solution.

I don't see why you would need a DB at all. It seems that all you need is to scan through the data. Keep the logs in their most "raw" state, process them, and then create a tarball for each day.
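
A minimal sketch of that day-level archiving step, assuming one directory per day of raw logs as in the first option; `archive_day` and the paths are hypothetical names:

    import shutil
    import tarfile
    from pathlib import Path

    def archive_day(day_dir: Path, archive_dir: Path) -> Path:
        """Roll one finished day's directory of raw logs into a tarball."""
        archive_dir.mkdir(parents=True, exist_ok=True)
        tar_path = archive_dir / (day_dir.name + ".tar.gz")
        with tarfile.open(tar_path, "w:gz") as tar:
            tar.add(str(day_dir), arcname=day_dir.name)  # keep the day name inside the tar
        shutil.rmtree(day_dir)                           # raw files are now redundant
        return tar_path

    # e.g. archive_day(Path("/var/logs/incoming/2010-03-01"),
    #                  Path("/var/logs/archive"))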

Overall, the only reason to aggregate would be to reduce the number of files. On some file systems, performance degrades rapidly once you put more than N files in a directory. Check your file system, and if this is the case, organize a simple two-level hierarchy, say, using the first two digits of the producer ID as the first-level directory name.
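
As an illustration, a sketch of that bucketing scheme, assuming numeric producer IDs zero-padded to five digits (the helper name and the padding width are assumptions):

    import os

    def producer_dir(base: str, producer_id: int) -> str:
        """Bucket producers by the first two digits of their zero-padded ID,
        so no single directory has to hold all >10k producer subdirectories."""
        pid = f"{producer_id:05d}"               # e.g. 42 -> "00042"
        return os.path.join(base, pid[:2], pid)

    # e.g. producer_dir("/var/logs/incoming/2010-03-01", 12345)
    #      -> "/var/logs/incoming/2010-03-01/12/12345"

With five-digit IDs this caps the first level at 100 directories, each holding at most ~1000 producer directories.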
