solving a problem: trying to isolate the proper files

socrates
Posts: 8
Joined: 22 Oct 2015 22:46

solving a problem: trying to isolate the proper files

Post by socrates »

I have a serious problem. I have an idea of how to resolve it, but I am unfamiliar with XYPlorer's command-line syntax, and that seemed to be a way to generate what I might need.

I was hoping you could point me in the right direction, using your program or one of the others I use.

Let me describe the problem. If you can point me in the right direction, or need more information, please write / ask.

There is no simple solution. At this point I am looking for the most accurate and fastest option.

I have gotten myself in a serious pickle . . . made worse, I am sure, because of stress and its attendant medical woes (which means I am not as focused as I should be—and usually am). Also made worse because there are some things I need to get done, but am afraid to do them (even email) since . . .

I have an elaborate system of multi-level—and presumably redundant—backups for:
1. Four computers:
a. Desktop (home)
b. Laptop I have been using (now at home)
c. Laptop I just bought [not an immediate issue]
d. Desktop (work) [not an immediate issue]

USING
2. Cloud backup via IDrive for a, b, and c [I define a backup set for each, but it should be more or less the same.]
3. Various localized backups, including
a. 2 external hard drives to back up the desktop (1a)
b. 2 other external HDs, which I have taken to work with me to serve the same purpose . . . although they are now seriously out of date
c. Deep backups (once a week) to a separate thumb and external hard drive (stored in my safe)
d. [all powered by SecondCopy profiles I designed]

4. sync.com [cloud and local]
a. To move specialized backups run by WinRAR command-line scripts and file lists. I create specified backup archives for other computers [files with the archive bit set] . . . and more or less daily archives of anything changed since the last one (usually the previous day's). The idea is that I take these archive files to the other computer, copy them to C:, and then extract them “here” (i.e., move the contents to the corresponding folders on the other computer). [A rough sketch of this kind of incremental job follows this list.]
b. An extra layer of backup for especially important files [user files (documents and AppData), ISO files, extra copies of program installation files, etc.]
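
For illustration only, here is a minimal sketch of that daily incremental idea in Python rather than my actual SecondCopy/WinRAR batch jobs. The root and archive paths are placeholders, and it only reads the Windows archive bit; clearing the bit after a successful run (roughly what WinRAR's archive-attribute switches do) is not shown.

import stat
import zipfile
from datetime import date
from pathlib import Path

ROOT = Path(r"C:\Users\hugh\Documents")                       # placeholder source folder
OUT = Path(rf"C:\Backups\changed_{date.today():%Y%m%d}.zip")  # placeholder target archive

def archive_bit_set(p: Path) -> bool:
    # Windows-only: the archive attribute is set whenever a file has changed
    # since the last backup job cleared it.
    return bool(p.stat().st_file_attributes & stat.FILE_ATTRIBUTE_ARCHIVE)

with zipfile.ZipFile(OUT, "w", zipfile.ZIP_DEFLATED) as zf:
    for f in ROOT.rglob("*"):
        if f.is_file() and archive_bit_set(f):
            # store paths relative to ROOT so the archive extracts into the same layout
            zf.write(f, f.relative_to(ROOT))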

In the past all this has served me extraordinarily well. I have rarely lost an important document or program file. I can, usually without too much hassle, identify the latest version, and once I move it, say, to the desktop at home, the correct version propagates via IDrive to the cloud backup, via SecondCopy to the local backups, and via WinRAR archives [run from a batch file].

I use TeraCopy to both copy and move files from folder to folder (same drive / different drive / flash drive). TC is faster, verifies that the copy or move was successful, and has nice features [e.g., overwrite all / overwrite smaller (or newer) files].

Finally, I use XYPlorer, which gives me more control over metadata / properties (time, archive bit, etc.), and lets me view the contents of folders however I wish (list / details, etc.), and side by side.

The multi-level system has always served me very well . . . until now.

NONE of the key folders (especially the user files in c:\users\hugh\) in any of the above seem to be the same. There is a difference in size (even if only 1%) or in the number of sub-folders and files—usually both. I think there are also problems with the other "users" folders (Default, All Users, Public, etc.).

I have been messing with this for at least 10 hours. It seems the problem is getting worse, not better.

If I could ascertain one authoritative source for the key folders and files (Documents, AppData, etc.) [especially for hugh / Default / All Users, etc.], I could (or should be able to) propagate it everywhere (although, given that TeraCopy seems to be acting up, I am no longer confident).

I need to be getting some work done. But I am afraid to change any further files lest I make matters even worse.

I have tried the tools I have and the tricks I know. All I can do is spot inconsistencies between various versions of the same folders in some places.

Questions:
1) Ideas?
2) Any OTHER wonderful software of which I am unaware . . . or some feature of the above programs I am not exploiting?
3) The long-ass solution would take about ten hours:
a. Get a detailed multi-level file list of the eight sources above (perhaps using XYPlorer . . . maybe from the command line; a rough sketch of such a listing follows this list), with the following info:
i. Name
ii. type
iii. Date created
iv. Date modified
v. Size
vi. Attributes
vii. checksums

THEN
b. Export each to an Excel spreadsheet
i. with a column for each of the above
ii. prefaced by a first column which identifies the source [desktop, desktop cloud, etc.]
c. Then, in Excel, do a custom sort:
i. Use Excel to eliminate duplicates across columns (a)(i)–(v)
ii. Then sort the list of non-duplicates to
1. Identify the latest version, including where it is [the source column from (b)(ii) above]
2. Assume it is the latest and most complete file
d. Move it to the other computers / drives / folders . . . assuming those will then be updated during the corresponding cloud and localized backups.
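
To make (a) and (b) concrete, here is a minimal sketch of the kind of listing I have in mind, done in Python rather than from XYPlorer's command line. The source labels and root paths are placeholders; in practice there would be one entry per backup source.

import csv
import hashlib
from datetime import datetime
from pathlib import Path

# Placeholder labels and roots; add one entry per backup source to compare.
SOURCES = {
    "desktop-home": Path(r"C:\Users\hugh"),
    "external-hd-1": Path(r"E:\Backup\Users\hugh"),
}

def md5sum(path, chunk=1 << 20):
    # Hash in 1 MB chunks so large files don't exhaust memory.
    h = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

with open("inventory.csv", "w", newline="", encoding="utf-8") as out:
    w = csv.writer(out)
    w.writerow(["Source", "Relative path", "Name", "Type", "Created",
                "Modified", "Size", "Attributes", "MD5"])
    for label, root in SOURCES.items():
        for f in root.rglob("*"):
            if not f.is_file():
                continue
            try:
                st = f.stat()
                digest = md5sum(f)
            except OSError:
                continue  # skip locked or unreadable files instead of aborting the run
            w.writerow([
                label,
                f.relative_to(root),
                f.name,
                f.suffix.lower(),
                datetime.fromtimestamp(st.st_ctime).isoformat(sep=" "),
                datetime.fromtimestamp(st.st_mtime).isoformat(sep=" "),
                st.st_size,
                getattr(st, "st_file_attributes", ""),  # Windows attribute bits (archive, hidden, ...)
                digest,
            ])

The resulting inventory.csv would then go into Excel for step (c): treat rows with the same relative path, name, dates, size, and MD5 as duplicates, and sort what remains by Modified to find the newest copy and which Source it lives on.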

ANY thoughts?

A shorter way, perhaps, although I am not sure how to do it accurately and completely, is just to compare folders on the above sources (size, number of sub-folders, number of files), isolate the key folders that are the problems, and then do the above-described spreadsheet of the contents of those folders. Then (c) and (d) above.
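That folder-level comparison could itself be scripted. A rough sketch (again Python, with a placeholder root) that tallies total size, sub-folder count, and file count for each immediate sub-folder, to be run once against each backup source and compared by eye:

from pathlib import Path

ROOT = Path(r"C:\Users\hugh")  # placeholder; point at each backup source in turn

def summarize(folder):
    # Return (total bytes, sub-folder count, file count) for everything under `folder`.
    size = dirs = files = 0
    for p in folder.rglob("*"):
        try:
            if p.is_dir():
                dirs += 1
            elif p.is_file():
                files += 1
                size += p.stat().st_size
        except OSError:
            pass  # ignore entries that can't be read
    return size, dirs, files

for sub in sorted(p for p in ROOT.iterdir() if p.is_dir()):
    size, dirs, files = summarize(sub)
    print(f"{sub.name:30} {size:>15,d} bytes  {dirs:>7} folders  {files:>8} files")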

I thought XYPlorer on the command line, with a "print" to a .csv file, might work.

I am stumped . . . and my efforts to date seem to be making matters worse.

Hugh

Hugh LaFollette <hughlafollette.com>
Professor of Philosophy and Cole Chair in Ethics, USF St. Petersburg &
5008 Malibu Dr.
Knoxville, TN 37918-4511
(865) 312-5551
(727) 608-7685
(727) 264-2448 (fax)
hughlafollette@gmail.com
Alt email: hhl@usfsp.edu

highend
Posts: 13274
Joined: 06 Feb 2011 00:33

Re: solving a problem: trying to isolate the proper files

Post by highend »

i. Name
ii. type
iii. Date created
iv. Date modified
v. Size
vi. Attributes
vii. checksums
While this is doable, you'd need to get this info (because of the checksum) for each file via scripting, or via a branch view and a custom column that calculates the checksum. Depending on the amount of data this can take... a very long time. A very, very long time...
One way to shorten it is to use the xxh32 checksum (several times faster than any other hash calculated via XY): viewtopic.php?f=7&t=19191&p=160568
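(For a feel of the difference outside XY, here is a minimal Python sketch, assuming the third-party xxhash package is installed, that times xxh32 against MD5 on a single file; the file path is a placeholder. The linked thread uses an XYPlorer script, not this.)

import hashlib
import time
from pathlib import Path

import xxhash  # third-party package: pip install xxhash

FILE = Path(r"C:\Users\hugh\some_big_file.iso")  # placeholder test file

def timed_digest(factory):
    # Hash FILE in 1 MB chunks and report the digest plus elapsed time.
    h = factory()
    start = time.perf_counter()
    with FILE.open("rb") as f:
        while block := f.read(1 << 20):
            h.update(block)
    return h.hexdigest(), time.perf_counter() - start

for name, factory in [("md5", hashlib.md5), ("xxh32", xxhash.xxh32)]:
    digest, secs = timed_digest(factory)
    print(f"{name:6} {digest}  {secs:.2f} s")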
I thought XYPlorer on the command line, with a "print" to a .csv file, might work.
It wouldn't contain checksums, so with no integrity check I wouldn't trust anything in this situation.

Dustydog
Posts: 321
Joined: 13 Jun 2016 04:19

Re: solving a problem: trying to isolate the proper files

Post by Dustydog »

After almost writing out some possible solutions, I do have one strong suggestion for your position and situation: image everything, then hire someone from either the IT or CS department at your school. Perhaps you could then get some useful work done on your newest laptop, enjoy the holidays, and save your frustration for when you're given a very small set of things to decide about at the end.

Sometimes very bright people take on too much outside of where they'd really rather be working.

Best of luck!

socrates
Posts: 8
Joined: 22 Oct 2015 22:46

Re: solving a problem: trying to isolate the proper files

Post by socrates »

Thanks. In the right environment, spot on, although probably pricier than I can handle moving into retirement.

At my former university I had some people who were bright and whom I could trust (to do it right and not to pry). I like some of the folks here; I just don't know them well enough to gauge their abilities and trustworthiness.

Moreover, at most universities there were problems I could solve that the IT people couldn't.

I think, though, I have a plan, albeit a relatively time-consuming plan.

Have a great holiday.

Hugh

highend
Posts: 13274
Joined: 06 Feb 2011 00:33

Re: solving a problem: trying to isolate the proper files

Post by highend »

albeit a relatively time-consuming plan
With the solution I've offered, you would have already solved that days ago :whistle:
