Caching – How to Delete Cache Files When the File Cache Module Is Installed

Hello,

I manage a site that someone else created and then handed over to me. The File Cache module is configured, and the folder where the cache data is stored has grown to 30 GB.

The site runs Drupal 7.43 on Red Hat Linux 6.5. Whenever I try to clear the cache through the Drupal admin UI (Administration / Development / Performance – "Clear all caches") or with Drush (drush cc all), the site goes down and I have to restore the folder and directory structure from a backup.

How to delete cache files when the file cache module is installed, to free up disk space?
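
For context, this is the kind of direct clean-up I have in mind (a sketch only – the cache path below is an assumption; the real location is whatever the File Cache module is configured to use in settings.php):

    # CACHE_DIR is an assumed path; check settings.php for the real one.
    CACHE_DIR=/var/www/html/sites/default/files/filecache
    # Delete cache entries in batches, oldest first, so disk space comes
    # back without removing the whole directory tree in one shot.
    find "$CACHE_DIR" -type f -mtime +1 -delete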

Thank you for your help.

c++ – Why does the command prompt not show .cpp files saved on the Desktop or in Documents?


My problem is that I have .cpp files on the Desktop and in Documents. I wrote a C++ program in Notepad++ and tried to build it from the command prompt, but when I change into the directory where I saved the file, the .cpp file I created in Notepad++ is not there.

I have downloaded the MinGW compiler and added it to the PATH environment variable. Any idea why these files do not show up in some directories? If I save the file in the Music directory instead, I can access it.
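
For reference, a quick way to verify this from the command prompt (a sketch assuming the default Windows profile layout; hello.cpp is a placeholder file name):

    rem Change to the current user's Desktop
    cd /d "%USERPROFILE%\Desktop"
    rem List every .cpp file actually present in this directory
    dir *.cpp
    rem Compile with MinGW's g++ once the file shows up
    g++ hello.cpp -o hello.exe

If dir lists something like hello.cpp.txt instead, the file was saved with a hidden extra extension, which would explain why it seems to be missing.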

Glacier recovery of purged files

I want to recover deleted files from my network folders from Amazon Glacier via Cloudberry. I do not know the exact number of deleted files, and I wondered whether it is possible to find that out through Cloudberry.

sharepoint online – Is it possible to make PDF files hosted in a document library non-downloadable?


Searching CSV files in C++

PS: this question includes only part of my code.

I take several user entries and store them in a CSV file:
int NewEntry() {
    cin.ignore();
    string name, phno, id, date;
    int Ecost, advance, due;
    ofstream file("data.txt", ios::in | ios::out | ios::app);
    if (file) {
        cout << "Customer name:- ";
        getline(cin, name);
        cout << "Identity card or Aadhar:- ";
        getline(cin, id);
        cout << "Telephone number:- ";
        getline(cin, phno);
        cout << "Date of the day (dd/mm/yy):- ";
        getline(cin, date);
        cout << "Estimated cost:- ";
        cin >> Ecost;
        cout << "Advance:- ";
        cin >> advance;
        due = Ecost - advance;
        file << name << ','
             << id << ','
             << phno << ','
             << date << ','
             << Ecost << ','
             << advance << ','
             << due
             << '\n';
        system("cls");
        cout << endl << "Entry created.";
        file.close();
    }
    return 0;
}

The data looks like this:

Sumeet, 11802638,8005958881,09-03-19,2000,1000,1000 // line ends here
Sagar, 11802637,7788830892,13-09-19,1200,200,1000
Rohit, 11802639,9998887776,10-09-19,2500,500,2000

Then I search by name, ID, or phone number using this part of the code:

getline(cin, names);
while (getline(file, line)) {
    stringstream ss(line);
    getline(ss, name, ',');
    getline(ss, id, ',');
    getline(ss, phno, ',');
    getline(ss, date, ',');
    getline(ss, ecost, ',');
    getline(ss, advance, ',');
    getline(ss, due, '\n');
    if (names == name) {
        cout << "Name:- " << name << "\nID:- " << id << "\nPhone number:- " << phno
             << "\nDate of entry:- " << date << "\nEstimated cost:- " << ecost
             << "\nAdvance paid:- " << advance << "\nDue:- " << due << endl;
    }
}

I apply the search by ID and number in the same way. The problem is that my program only searches the first line: if I search for a name or an ID that is on the second line, it is not found. Whereas when I print all the data in the CSV file, it works perfectly:

void AllDisplay() {
    string date, id, name, phno, ecost, advance, due;
    string line;
    ifstream file("data.txt");
    while (getline(file, line)) {
        stringstream ss(line);
        getline(ss, name, ',');
        getline(ss, id, ',');
        getline(ss, phno, ',');
        getline(ss, date, ',');
        getline(ss, ecost, ',');
        getline(ss, advance, ',');
        getline(ss, due, '\n');
        cout << "Name:- " << name << "\nID:- " << id << "\nPhone number:- " << phno
             << "\nDate of entry:- " << date << "\nEstimated cost:- " << ecost
             << "\nAdvance paid:- " << advance << "\nDue:- " << due << endl;
        cout << "-------------------------------------------" << endl;
    }
}
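
For comparison, here is a minimal, self-contained sketch of the same search that visits every line (it assumes the data.txt layout shown above; main() and the prompt text are just scaffolding for the example). If this version does find records beyond the first line, the difference lies in how the original code opens or positions its file stream before the loop, not in the parsing itself:

    #include <fstream>
    #include <iostream>
    #include <sstream>
    #include <string>
    using namespace std;

    int main() {
        string target;
        cout << "Name to search:- ";
        getline(cin, target);

        ifstream file("data.txt");      // fresh stream, positioned at the start
        string line;
        bool found = false;
        while (getline(file, line)) {   // one iteration per CSV line
            stringstream ss(line);
            string name, id, phno, date, ecost, advance, due;
            getline(ss, name, ',');
            getline(ss, id, ',');
            getline(ss, phno, ',');
            getline(ss, date, ',');
            getline(ss, ecost, ',');
            getline(ss, advance, ',');
            getline(ss, due);           // last field: read to end of line
            if (name == target) {
                cout << "Name:- " << name << "\nID:- " << id
                     << "\nPhone number:- " << phno << "\nDate of entry:- " << date
                     << "\nEstimated cost:- " << ecost << "\nAdvance paid:- " << advance
                     << "\nDue:- " << due << endl;
                found = true;
            }
        }
        if (!found) cout << "No matching entry." << endl;
        return 0;
    }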

SharePoint Online – Errors were detected while compiling the workflow. Workflow files have been saved but cannot be executed.

For a week, we have been getting this error whenever we try to publish new workflows or republish existing ones.

Errors were detected while compiling the workflow. Workflow files have been saved but cannot be run.

Has anyone faced this problem?

google sheets – How to export .CSV files without commas in empty cells?

Here is my problem: I have a form converter that simply rearranges columns. I use a script to paste the values only into another sheet, in order to export it and upload it to another website.

However, when I export, stacks of commas appear in the file after the filled cells. As far as I know, those cells are, and should be, empty.

When I open the file in Excel and manually clear the contents of the empty cells, the commas disappear.

My question is: how can I export without these commas being saved, and what is going on in those supposedly empty cells?

[Screenshot: the export sheet – the cells below should be completely empty]

[Screenshot: Windows preview showing the commas that should not be there]

Here is my export script:

function CopyPasteValues() {
  var spreadsheet = SpreadsheetApp.getActive();
  spreadsheet.getRange('O3:AS90').activate();
  spreadsheet.setActiveSheet(spreadsheet.getSheetByName('Export'), true);
  spreadsheet.getRange('A2').activate();
  spreadsheet.getRange('Converter!O3:AS90').copyTo(spreadsheet.getActiveRange(), SpreadsheetApp.CopyPasteType.PASTE_VALUES, false);
  spreadsheet.getRange('F27').activate();
}
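
One workaround worth trying (a sketch, not a verified fix: the idea is that the "empty" cells may contain empty text strings rather than nothing at all, and the range Export!A2:AE89 below is an assumption – adjust it to wherever the pasted data actually lands):

    function clearEmptyStrings() {
      // Sketch: truly empty every cell that holds an empty string (""),
      // so a CSV export stops emitting trailing commas for it.
      // The range is an assumed location for the pasted data.
      var sheet = SpreadsheetApp.getActive().getSheetByName('Export');
      var range = sheet.getRange('A2:AE89');
      var values = range.getValues();
      for (var r = 0; r < values.length; r++) {
        for (var c = 0; c < values[r].length; c++) {
          if (values[r][c] === '') {
            // Offsets map array indices back to sheet coordinates (A2 origin).
            sheet.getRange(r + 2, c + 1).clearContent();
          }
        }
      }
    }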

Up-4ever – Earn money by sharing your files, with a minimum payout of $1

PROOF OF PAYMENT

https://www.up-4.net/?op=proof

Referral link

https://www.up-4.net/free580526.html

MongoDB – Too many open files when resyncing a member

I have been having this problem for several days. I have a MongoDB cluster with 3 shards and 3 replicas.

        1   2   3
    A   O   S   S
    B   S   P   S
    C   P   O   P

P - Primary state
S - Secondary state
O - Other state

The letters are the replica machines and the numbers are the shards.

I am trying to resync all the data (around 2 TB) on the mongod machines in the "other" state (A1 and C2). But during the resync I hit an error: the primary mongod service crashed because of "Too many open files".

2019-03-16T16:35:22.351+0000 E -        [conn28204] can not open /dev/urandom Too many open files in the system
2019-03-16T16:35:22.362+0000 I NETWORK  [listener] Error accepting new connection on 0.0.0.0:27017: Too many open files in the system
2019-03-16T16:35:22.362+0000 I NETWORK  [listener] Error accepting new connection on 0.0.0.0:27017: Too many open files in the system
2019-03-16T16:35:22.362+0000 I NETWORK  [listener] Error accepting new connection on 0.0.0.0:27017: Too many open files in the system

I have already tried raising the ulimit values as MongoDB recommends.

> ulimit -a

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 241518
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 64000
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 241518
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited

and set the following in /etc/security/limits.conf:

* soft nofile 64000
* hard nofile 64000
root soft nofile 64000
root hard nofile 64000

But none of this solves my problem; the mongod service still goes down because of "Too many open files". I have been stuck for 3 days. Does anyone have ideas or solutions?
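
For what it's worth, the exact wording "Too many open files in the system" usually points at the kernel-wide handle limit (fs.file-max) rather than the per-process ulimit, so it may be worth checking both (a sketch; the value at the end is purely illustrative):

    # System-wide cap on open file handles (separate from ulimit -n):
    cat /proc/sys/fs/file-max
    # Handles currently allocated / unused / maximum:
    cat /proc/sys/fs/file-nr
    # Effective limits of the running mongod process:
    cat /proc/$(pidof mongod)/limits
    # Raise the system-wide cap if it is exhausted (illustrative value):
    sysctl -w fs.file-max=400000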

Nginx proxy cache – 404 not found for static files (css, js, jpg, png, etc.)

My Nginx proxy server: 10.90.100.2
My back-end server: 10.90.100.3

My proxy server does not load static files (css, js, woff, png, etc.).

My conf:

proxy_cache_path /etc/nginx/proxy_cache levels=1:2 keys_zone=ferditest:10m inactive=60m;
proxy_cache_key "$scheme$request_method$host$request_uri";

server {

    listen 80;
    listen 443 ssl;
    server_name www.abc.com abc.com;

    ssl_certificate /etc/letsencrypt/live/abc.com/fullchain.pem;
    ssl_certificate_key /etc/letsencrypt/live/abc.com/privkey.pem;

    location ~* \.(jpg|jpeg|png|gif|ico|css|js|pdf|woff|woff2)$ {
        expires 30d;
    }

    #include /etc/nginx/bots.d/ddos.conf;
    #include /etc/nginx/bots.d/blockbots.conf;
    #include /etc/nginx/bots.d/blacklist-ips.conf;

    ssl_ciphers HIGH:!aNULL:!MD5;
    ssl_prefer_server_ciphers on;
    ssl_session_cache shared:SSL:1m;
    ssl_session_timeout 5m;
    ssl_protocols TLSv1 TLSv1.1 TLSv1.2; # do not use SSLv3 because of POODLE

    pagespeed on;
    pagespeed FileCachePath "/etc/nginx/ngx_pagespeed/";
    pagespeed EnableFilters collapse_whitespace;
    pagespeed EnableFilters extend_cache;
    pagespeed EnableFilters make_google_analytics_async;
    pagespeed EnableFilters lazyload_images;
    pagespeed EnableFilters rewrite_images;

    location ~ "\.pagespeed\.([a-z]\.)?[a-z]{2}\.[^.]{10}\.[^.]+" {
        add_header "" "";
    }
    location ~ "^/pagespeed_static/" { }
    location ~ "^/ngx_pagespeed_beacon$" { }

    location / {

        add_header Strict-Transport-Security "max-age=31536000";
        add_header X-Content-Type-Options nosniff;
        add_header X-Cache $upstream_cache_status;

        proxy_cache ferditest;
        add_header X-Proxy-Cache $upstream_cache_status;
        proxy_ignore_headers X-Accel-Expires Expires Cache-Control Set-Cookie;
        proxy_set_header Accept-Encoding "gzip";

        proxy_buffering on;
        proxy_cache_valid 200 302 1m;
        proxy_cache_valid 404 1m;
        proxy_cache_methods GET HEAD;

        proxy_cache_use_stale error timeout invalid_header updating http_500 http_502 http_503 http_504;
        proxy_cache_lock on;
        proxy_bind 0.0.0.0;

        proxy_pass http://10.90.100.3;

        proxy_set_header Host $http_host;
        proxy_set_header X-Forwarded-Host $http_host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto https;
        proxy_set_header HTTPS "on";
        proxy_set_header X-NginX-Proxy true;
        proxy_set_header X-Accel-Internal /nginx-static-location-internal;
    }
}

error.log


[error] 23151#23151: *6 open() "/etc/nginx/html/test1/wp-includes/js/wp-embed.min.js" failed (2: No such file or directory), client:
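
One thing stands out (an observation and a sketch, not a confirmed fix): the static-file location block in the config above contains only expires 30d; and no proxy_pass, so nginx falls back to serving those extensions from its local root (/etc/nginx/html), which is exactly where the open() error above says it is looking. A minimal sketch of proxying that location to the back end instead:

    location ~* \.(jpg|jpeg|png|gif|ico|css|js|pdf|woff|woff2)$ {
        expires 30d;
        proxy_pass http://10.90.100.3;      # forward static requests to the back end
        proxy_set_header Host $http_host;   # preserve the original Host header
    }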

Thank you