eQSL.cc Forum
Author Topic: Any one running into duplicate entries on uploads? (8 messages, Page 1 of 1)

VE3FWF Bernie Murphy
Posts: 110
Joined: May 22, 2007




Posted: Jan 23, 2010 04:43 PM          Msg. 1 of 8
I've just uploaded about 4500 QSLs and I have noticed that there are several duplicate QSOs in the database. While this is not a big issue in general, it does waste some disk space on the server.

A tool to check for duplicates would be useful.

Maybe the DB admin runs a cleanup once in a while to remove the dups; I do not know.

73, Bernie, VE3FWF, Ottawa, Canada

VE3FWF Bernie Murphy

RD3AJB Mikhail Nosov
Posts: 33
Joined: Dec 9, 2008



Posted: Jan 25, 2010 06:47 AM          Msg. 2 of 8
Hi! I asked the same question in the English Support forum. This problem has lasted for several months at least...

RD3AJB Mikhail Nosov

IK1HGI ANTONIO MUSUMECI
Posts: 5
Joined: Aug 5, 2000



Posted: Feb 12, 2010 04:38 PM          Msg. 3 of 8
Dear Sir,
You can download your eQSL log by going to "Log / Output", and at the bottom you will find a link to download the log in ADIF format.
73
CT1EKD
Volunteer support

Dear Pedro CT1EKD, thank you for answering my request.
I did as you said and everything went fine, thank you; I downloaded the log file.
The ADIF log that was built contained 3,138 records.

What I cannot understand is why I end up with so many duplicate QSOs.
In fact, even checking my years 2000-2001-2002-2003-2004 I still find duplicates.
Today I wanted to go through my log of 3,138 records step by step;
after eliminating the duplicates, my count comes to 2,506 records!
Now I will keep this in mind for the future!
IK1HGI ANTONIO MUSUMECI
Edited by IK1HGI ANTONIO MUSUMECI on Feb 12, 2010 at 04:38 PM

VE3OIJ P. Darin Cowan
Posts: 186
Joined: Jul 9, 2006


Posted: Feb 13, 2010 01:17 PM          Msg. 4 of 8
If it were still 1985, some wasted bits in the database would be a problem, because disk space was expensive. Today, with a terabyte of space costing ~$200, some duplicate records don't really matter much in terms of space.

They may affect performance over time, however, if the number of duplicates ever becomes a significant portion of the overall database.

I wouldn't worry about it too much.

VE3OIJ P. Darin Cowan

OK1FMY Milosh Soukup
Posts: 10
Joined: Aug 27, 2007




Posted: Sep 21, 2010 06:58 AM          Msg. 5 of 8
We already have a tool for dupes. Our members should be instructed to check their Inbox first before uploading new eQSLs. That would prevent the situation where a dupe is created by efforts from both directions.

OK1FMY Milosh Soukup

KI4SYE Terry D. Waters
Posts: 1
Joined: Feb 23, 2007



Posted: May 6, 2012 05:02 PM          Msg. 7 of 8
My problem is that every time I upload my ADIF the same duplicates are generated and I have to go back and delete them one at a time.

How can I upload only the records in a particular date range, so that only the new records are uploaded?

KI4SYE Terry D. Waters

VE3FWF Bernie Murphy
Posts: 110
Joined: May 22, 2007




Posted: May 6, 2012 11:56 PM          Msg. 8 of 8
Terry:

What logging program are you using? It appears that you are uploading all of your QSOs every time.

Programs such as HRD have the ability to upload only the specific QSOs that you select.
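
If your logging program cannot do that, a rough workaround is to trim the exported ADIF file down to a date range before uploading it, so only the newer QSOs go up. Below is a minimal Python sketch of that idea; it assumes <QSO_DATE:8> fields in YYYYMMDD form and <EOR> record separators, and the file names and dates are only examples (this is not an eQSL or ADIFPROC tool).

# Rough sketch: keep only ADIF records whose QSO_DATE falls inside a given
# range, so older QSOs are not re-uploaded. Assumes <QSO_DATE:8> fields in
# YYYYMMDD form and <EOR> record separators; the ADIF header is not
# preserved in this simplified version.
import re

def qso_date(record):
    """Pull the YYYYMMDD QSO_DATE value out of one ADIF record."""
    m = re.search(r'<QSO_DATE:(\d+)[^>]*>', record, re.IGNORECASE)
    return record[m.end():m.end() + int(m.group(1))].strip() if m else ''

def filter_by_date(in_path, out_path, start, end):
    text = open(in_path, 'r', errors='ignore').read()
    records = re.split(r'<EOR>', text, flags=re.IGNORECASE)
    kept = [r.strip() for r in records
            if r.strip() and start <= qso_date(r) <= end]
    with open(out_path, 'w') as out:
        out.write('\n<EOR>\n'.join(kept) + '\n<EOR>\n')

# Example: keep only QSOs made during 2012 (file names are illustrative).
filter_by_date('mylog.adi', 'new_only.adi', '20120101', '20121231')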

A good utility to use is ADIFPROC, which will scrub your ADIF file and remove duplicates (if any are present).
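
If you prefer a do-it-yourself scrub, here is a rough Python sketch of the same idea (it is not ADIFPROC itself): it drops records that repeat the same call, band, mode, date, and time. It assumes <EOR> record separators and simple field handling; the file names are only examples.

import re

def field(record, name):
    """Return the value of an ADIF field (e.g. CALL), or '' if missing."""
    m = re.search(r'<%s:(\d+)[^>]*>' % name, record, re.IGNORECASE)
    if not m:
        return ''
    return record[m.end():m.end() + int(m.group(1))].strip().upper()

def dedupe_adif(in_path, out_path):
    text = open(in_path, 'r', errors='ignore').read()
    # Pass the ADIF header (everything up to <EOH>) through untouched.
    if '<EOH>' in text.upper():
        split_at = text.upper().index('<EOH>') + len('<EOH>')
        header, body = text[:split_at], text[split_at:]
    else:
        header, body = '', text
    seen, kept = set(), []
    for rec in re.split(r'<EOR>', body, flags=re.IGNORECASE):
        if not rec.strip():
            continue
        key = tuple(field(rec, f) for f in
                    ('CALL', 'BAND', 'MODE', 'QSO_DATE', 'TIME_ON'))
        if key in seen:          # same call/band/mode/date/time already kept
            continue
        seen.add(key)
        kept.append(rec.strip())
    with open(out_path, 'w') as out:
        out.write(header + '\n')
        out.write('\n<EOR>\n'.join(kept) + '\n<EOR>\n')

dedupe_adif('mylog.adi', 'mylog_nodupes.adi')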


VE3FWF Bernie Murphy
Edited by VE3FWF Bernie Murphy on Sep 4, 2012 at 02:23 AM