html – NGINX – Sites in LAN don’t access folders

In a LAN, I installed NGINX, and the default configuration file is:

server {
        listen 80;
        listen [::]:80;

        root /var/www;

        index index.html index.htm;

        server_name _;

        location / {
                try_files $uri $uri/ =404;
        }

}

Now here is tarefas.conf:

server {
        listen 80;
        listen [::]:80;

        root /var/www/tarefas;

        index index.html index.htm;

        server_name tarefas.home;

        location / {
                try_files $uri $uri/ =404;
        }
        location ~ \.(css) {
                root /var/www/tarefas/css;
        }
        location ~ \.(js) {
                root /var/www/tarefas/js;
        }
}

At http://serverip, the index.html inside /var/www opens (OK, that’s correct). At http://serverip/tarefas, the index.html inside the tarefas folder opens (OK, that’s correct too). But the tarefas folder contains /css and /js directories with their respective files, and when I access http://serverip/tarefas, the js and css files are requested from the server root (/var/www) instead, e.g. http://serverip/css/style.css instead of http://serverip/tarefas/css/style.css.
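
For what it’s worth, this symptom is exactly what root-relative asset URLs produce: the browser resolves an href starting with "/" against the host root, not against the page’s folder. A hypothetical illustration (the file names are assumptions, not taken from the question):

<!-- hypothetical /var/www/tarefas/index.html -->
<link rel="stylesheet" href="/css/style.css">  <!-- browser requests http://serverip/css/style.css -->
<link rel="stylesheet" href="css/style.css">   <!-- browser requests http://serverip/tarefas/css/style.css when the page is served at /tarefas/ -->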

Thanks.

script – scrape SharePoint folders?

I have an old SharePoint (not SharePoint Online) with parent folders for each US state and child folders for different years. Each year folder contains folders by order. I want to check each Order folder for a document named ‘Order History’. Is there a way to produce the result in an Excel file, listing the Year and Order number wherever the required document is present?

In short, instead of going into each order folder manually, I want a way to automatically pull the data in rows like SQL does. Is that possible? Where does SharePoint store all this document data in the back end?
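
For illustration, one possible approach is to walk the folder tree over SharePoint’s REST API and write a CSV that Excel can open. This is only a sketch, assuming an on-premises SharePoint recent enough to expose the REST API and reachable with NTLM credentials; the site URL, credentials, library path, and the exact ‘Order History’ file name are all placeholders:

import csv

import requests
from requests_ntlm import HttpNtlmAuth  # pip install requests requests_ntlm

SITE = "http://sharepoint.example.com/sites/orders"  # placeholder site URL
AUTH = HttpNtlmAuth("DOMAIN\\user", "password")      # placeholder credentials
HEADERS = {"Accept": "application/json;odata=verbose"}

def list_items(kind, folder_url):
    """Return the 'Folders' or 'Files' inside a server-relative folder URL."""
    url = f"{SITE}/_api/web/GetFolderByServerRelativeUrl('{folder_url}')/{kind}"
    response = requests.get(url, auth=AUTH, headers=HEADERS)
    response.raise_for_status()
    return response.json()["d"]["results"]

with open("order_history.csv", "w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["State", "Year", "Order", "HasOrderHistory"])
    root = "/sites/orders/Shared Documents"  # placeholder library path
    for state in list_items("Folders", root):                            # US State folders
        for year in list_items("Folders", state["ServerRelativeUrl"]):   # Year folders
            for order in list_items("Folders", year["ServerRelativeUrl"]):  # Order folders
                names = [f["Name"] for f in list_items("Files", order["ServerRelativeUrl"])]
                has_doc = any("Order History" in name for name in names)
                writer.writerow([state["Name"], year["Name"], order["Name"], has_doc])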

untagged – How to get a website with a custom password entry form in index.php to prohibit access to all other files and folders

I know it’s not exactly what you want, but I think your GoDaddy technician is onto the best, easiest solution. What I’d do is put all your secure files into a separate directory and link to that directory from your unsecured index.php file. You can include a note telling your users what username to use if you only want them to have to remember a password. Essentially then you’d have a site like this:

For example, in https://my_family.com/index.php:

<p>
<a href="https://my_family.com/secure/first.html">Login Here with the username "Guest"</a>
</p>

Then in https://my_family.com/secure/ you’d have your .htaccess file as described by your GoDaddy tech which would be something like:

AuthType Basic
AuthName "WHATEVER_YOU_WANT_HERE"
AuthUserFile /path/to/password/file
Require valid-user
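
To create that password file, the standard htpasswd utility that ships with Apache would do; the path and the "Guest" username here are just whatever you chose above, and it prompts for the password:

htpasswd -c /path/to/password/file Guest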

Unauthenticated users would thus be able to view and visit your index.php page, but they’d then need the correct password to access your sensitive content. Granted, they’d have the ugly login form you’re talking about, but it would be quick, easy and secure.

That said, if you really want to have a custom form, it’s going to add a whole lot of complexity. But there are a number of approaches you could take. Here’s one:

  1. Secure your sensitive files by putting them into a separate directory as above — or better yet, take them out of the DocumentRoot completely.
  2. Create a script that allows for authentication and that reads and outputs the secure files to authenticated users upon request.

You could do something like this:

<?php

class MySecurePage
{
    const PASSWORD = 'YOUR_PASSWORD_HERE';
    const SECURE_PATH = '/path/to/secure/folder';

    /**
     * Run the page
     */
    public static function run()
    {
        session_start();
        if (isset($_GET['directory'])) {
            $directory = $_GET['directory'];
        } else {
            $directory = '/';
        }

        if (isset($_GET['file'])) {
            // Regularize forward slashes and remove any periods in the path which could otherwise be a security risk
            $file = $_GET['file'];
        } else {
            $file = '';
        }

        if (isset($_GET['log_out'])) {
            $_SESSION['logged_in'] = false;
            echo "Logged Out";
        } elseif (!self::checkLogin() && !self::checkPassword()) {
            self::showLoginForm();
        } elseif ($file) {
            self::readFile($directory, $file);
        } else {
            self::showFiles($directory);
        }
        return true;
    }

    /**
     * Returns TRUE if the user is logged in.
     *
     * @return bool
     */
    public static function checkLogin()
    {
        if (isset($_SESSION['logged_in'])) {
            return $_SESSION['logged_in'];
        } else {
            return false;
        }
    }

    /**
     * If the user submitted the correct password, sets the logged_in SESSION variable and refreshes the page to complete
     * the log in process.
     *
     * @return bool
     */
    public static function checkPassword()
    {
        if (isset($_POST['password']) && $_POST['password'] === self::PASSWORD) {
            $_SESSION['logged_in'] = true;
            header('Location: ' . self::getBaseUrl());
            exit;
        } else {
            return false;
        }
    }

    /**
     * Show the login form
     *
     * @param bool $hasError
     */
    public static function showLoginForm($hasError = false)
    {
        if (isset($_POST['password'])) {
            echo "<div>Invalid password</div>";
        }
        ?>
        <form action="" method="post">
            <label for="password">Password:</label>
            <input type="password" name="password" id="password">
            <input type="submit" name="submit" value="Submit">
        </form>
        <?php
    }

    /**
     * Shows a list of the files/directories in a given directory
     *
     * @param $directory
     * @throws Exception
     */
    public static function showFiles($directory)
    {
        // Filter out hidden files. You may also prefer to only show certain types of files (e.g., HTML files)
        $secureDirectory = self::getSecurePath($directory);
        $files = preg_grep('/^([^.])/', scandir($secureDirectory));

        echo "<div>Current Directory: $directory</div>";


        if ($files) {
            echo '<ul>';
            foreach ($files as $file) {
                if (is_dir($secureDirectory . '/' . $file)) {
                    printf('<li><a href="%s">%s</a></li>', self::getBaseUrl() . '?directory=' . urlencode($directory . $file), htmlentities($file));
                } else {
                    printf('<li><a href="%s?file=%s&directory=%s">%s</a></li>', self::getBaseUrl(), urlencode($file), urlencode($directory), htmlentities($file));
                }
            }
            echo '</ul>';
        } else {
            echo 'No files';
        }
    }

    /**
     * Outputs the contents of a file
     *
     * @param $directory
     * @param $file
     */
    public static function readFile($directory, $file)
    {
        // As above, regularize forward slashes and remove any periods in the path which could otherwise be a security risk
        $fullPath = preg_replace('#(/)+#', '/', '/' . self::SECURE_PATH . '/' . $directory . '/') . str_replace('/', '', $file);
        if (is_file($fullPath)) {
            readfile($fullPath);
        } else {
            echo "Not a file ";
            echo $fullPath;
        }
    }

    /**
     * Gets the base URL for the current web page
     *
     * @return string
     */
    public static function getBaseUrl()
    {
        return 'https://' . $_SERVER['HTTP_HOST'] . strtok($_SERVER['REQUEST_URI'], '?');
    }

    /**
     * Builds a secure path for the given directory and file. Ensures that the directory or file is located within the
     * secure path so that malicious users cannot access files they shouldn't have access to
     *
     * @param $directory
     * @param null $file
     * @return bool|string
     * @throws Exception
     */
    public static function getSecurePath($directory, $file = null)
    {
        // First let's regularize the forward slashes in the directory
        $directory = preg_replace('#/+#', '/', '/' . self::SECURE_PATH . '/' . $directory . '/');
        // Next get rid of all forward slashes in the file
        $file = str_replace('/', '', $file);

        $result = realpath($directory . $file);

        // Make sure that the directory/file falls within the secure directory
        if (strpos($result, realpath(self::SECURE_PATH)) !== 0) {
            throw new Exception("Invalid path");
        } else {
            return $result;
        }
    }
}

MySecurePage::run();

Please note that the script above is only meant to be an example. It’s not very usable; it hasn’t been thoroughly tested; and it may well introduce security vulnerabilities!

Despite the above example, I’d seriously suggest you consider the first option even if it’s not exactly what you want. It’s not only much easier to implement, but it’s also likely to be more secure. For instance, in the script I included, I essentially hard-coded a plaintext password into the file, which is a big no-no from a security standpoint. (I suppose you could do something like hash the password to make it a little more secure, as sketched below, but even this is far less than ideal!) As another example, it would also be relatively easy to accidentally allow authenticated users to access files you don’t even want them to be able to access if you’re not careful.
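
If you did go the hashing route, a minimal sketch using PHP’s built-in password_hash()/password_verify() might look like this; the constant name and the generated hash are placeholders, not part of the script above:

// Generate the hash once, e.g.:
//   php -r "echo password_hash('YOUR_PASSWORD', PASSWORD_DEFAULT);"
const PASSWORD_HASH = '$2y$10$REPLACE_WITH_GENERATED_HASH';

public static function checkPassword()
{
    // Compare the submitted password against the stored hash instead of plaintext
    if (isset($_POST['password']) && password_verify($_POST['password'], self::PASSWORD_HASH)) {
        $_SESSION['logged_in'] = true;
        header('Location: ' . self::getBaseUrl());
        exit;
    }
    return false;
}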

I guess the question you need to decide is how much time and effort you’re willing to spend just to have a prettier login form.

home screen folders – How do you access Work Profile files via USB from a PC?

I tried to modify files on the device from a PC over USB, but the Work Profile’s files and folders do not show up; the PC only displays the personal profile’s storage.

On the device itself the Work Profile folder and its files are visible, but when the device is connected to the PC those folders are simply not shown, so I cannot modify them.

Is there a way to access and modify the files inside the Work Profile from the PC via USB?

Thanks for your support.

Need some help figuring out why email I send goes into people’s spam folders

I purchased a domain name 3-4 years ago for my business. I have a single email address associated with that domain that I use to email people. I have never sent out bulk emails, spam, email adverts, etc. My emails are always one-on-one, or occasionally with a few people CC’ed, but always with people I am having an ongoing conversation with. It’s never anyone I blindly email trying to drum up business or anything.

But a lot of people have been telling me that emails I send them wind up in their junk folder. And it’s not confined to one email provider – I’ve had Gmail, Yahoo, and Office 365 people all tell me my mail went into their spam folder. I would really like to figure out why this is happening.

I use Gmail to send messages. I have my own server, which receives incoming messages and forwards them to my Gmail inbox.

I’ve used mxtoolbox.com to check my domain, and it’s not showing up as blacklisted anywhere. I have an SPF record on the domain which authorizes Gmail to send the email. In an effort to track this down, I also just added a DMARC record to the domain.
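
For reference, those two DNS TXT records typically look something like this for a domain that sends through Gmail (the values here are illustrative, not my actual records):

mydomain.com.          TXT  "v=spf1 include:_spf.google.com ~all"
_dmarc.mydomain.com.   TXT  "v=DMARC1; p=none; rua=mailto:me@mydomain.com"

The rua tag tells receiving servers where to send aggregate DMARC reports, which is exactly the kind of data that would help confirm or rule out the spoofing theory below.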

The only things I can think of right now are:

  1. someone else is spoofing email from my domain and people are marking those spoofed messages as spam, which is causing the legit messages to get flagged as spam.
  2. something about the format of my emails is triggering them to be flagged as spam.

The only real formatting I have is my email signature, which looks like this:

FirstName LastName
mydomain.com
me@mydomain.com
212 555 1212

Thumbtack | Facebook

Does anyone have suggestions on other steps I can take to figure out why these email providers are flagging my messages as spam and how to prevent it from happening?

Unique accesses on tree folders structure

I have a structure in a list that looks like below:

List 1

  • SubList in a Folder 1.a
  • SubList in a Folder 1.b
  • SubList in a Folder 1.c
  • SubList in a Folder 1.z

I set unique permissions (using groups) for List 1 and every SubList in a Folder 1.a, 1.b etc as they don’t need to see the other content. Every list and sublist has unique permissions.

I need a way to cut edit access for all users at once (for all folders/sublists) on the cut-off date, so they don’t edit the lists while we’re closing the books.

I realized that even if I remove their access from List 1, they can still edit the sublists because they have individual unique permissions on them.

Is there a way to lock edit for all sublists in folders for all users at once?

Many thanks!

Are there any files or folders (apart from robots.txt and favicon.ico) which MUST go in the root directory?

I think it may be difficult to get an exhaustive list of all possible files which need to live at the root. For one thing, different content management systems may place various files at the root, while others may place those elsewhere, so it depends on what platform you’re using.

Generally, your index.html file will be found at the root, but keep in mind that your root is still a folder, which may have different names, depending on your web host or your CMS.

Then, you have changing standards. The sitemap.xml file used to commonly be placed at the root, but these days many CMSs like WordPress (via plugins like Yoast) allow for creation of a sitemap_index.xml file, which then leads to a list of sitemaps broken down by content type. Sometimes they all live at the root; other times they’re in a directory. Having them in a directory is okay, as long as the sitemap index file is at the root and the search bots can easily find and crawl that directory. Thus, the sitemap.xml file may not exist on a website at all anymore, replaced by a (slightly) more complex sitemap information architecture. More on WordPress XML sitemaps here.
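
For illustration, a sitemap index is just a small XML file pointing at the individual sitemaps; the URLs here are hypothetical:

<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://example.com/post-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://example.com/page-sitemap.xml</loc>
  </sitemap>
</sitemapindex>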

And then you have specific use cases. If your website is a publisher’s site that actively sells ad inventory, you need an ads.txt file. This file should be at the root. If you’re an ad exchange or an SSP (sell-side platform), you need a sellers.json file, which should also live at the root. Read more about ads.txt and sellers.json.
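
For illustration, ads.txt is a plain-text file with one authorized seller per line: the ad system’s domain, your publisher account ID, the relationship (DIRECT or RESELLER), and an optional certification authority ID. The IDs below are placeholders:

google.com, pub-0000000000000000, DIRECT, f08c47fec0942fa0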

Perhaps the best way to go about it is to learn more about your CMS, figure out the functionality you’re looking for, and follow the standard; the documentation will tell you where the crucial files should live.

Google File Stream: How can I access "shared with me" folders?

Obviously, "shared with me" folders are visible in the Google Drive web interface, but I cannot find them in my local Google File Stream. One suggestion online was to create a shortcut in my own Google Drive, but it didn’t work. Any ideas? Thanks!

apache 2.4 – How to write .htaccess so that all the files and folders in one folder are still served while the website is in maintenance mode (403 Forbidden)?

I have the following file structure in cPanel

web_root_folder_
               |____.neverDelete/_____
               |                     |_____img/logo-30.png
               |                     |_____js/error-page.js
               |                     |_____css/error-page.css
               |
               |____403.shtml
               |
               |____.htaccess

I wanted to write an .htaccess file with some code that puts the website into ‘maintenance mode’.
So I wrote a 403.shtml page, which uses external CSS, JavaScript and images stored in the .neverDelete folder.

I wrote the following code in .htaccess

# The WORKING CODE (Too Long 😑)
Deny From All
<FilesMatch 404-layout.min.css>
    Allow From All
</FilesMatch>
<FilesMatch logo-small-transparent-30.png>
    Allow From All
</FilesMatch>
<FilesMatch error-page.js>
    Allow From All
</FilesMatch>

This code worked: it returned a 403 (Forbidden) for all files except the three mentioned in the .htaccess file.

But I want all the files and folders present in .neverDelete/ to be served on my website when in maintenance mode.

So I visited http://httpd.apache.org/docs/1.3/mod/core.html#directory for help and wrote the code below in .htaccess, which actually resulted in a 500 (server error).

# WRONG CODE
Deny From All
<Directory .neverDelete/ >
    Allow From All
</Directory>

I don’t know why it didn’t work. Please suggest how I can make all the files and folders present in .neverDelete/ be served on my website when in maintenance mode.
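
As a side note, <Directory> sections are only allowed in the main server configuration (httpd.conf / virtual hosts), not in .htaccess context, which is why Apache returns a 500 here. A rough sketch of one alternative using mod_rewrite, assuming the module is enabled (adjust the paths to your actual file names):

# Serve the error page and everything under .neverDelete/; return 403 for the rest
ErrorDocument 403 /403.shtml
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/\.neverDelete/
RewriteCond %{REQUEST_URI} !^/403\.shtml$
RewriteRule ^ - [F]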

sharepoint online – View All Uppermost Shared Folders

I’m not sure exactly how to describe what I’m looking to do, so this title may not be clear, and I may go overboard with the example, but here goes:

I’m currently looking to migrate my company’s files to SharePoint Online from another service we’ve been using for a few years (ShareFile from Citrix). The vast majority of our company files are shared with all employees, one or two folders are shared with all our clients, and there are a few folders that we share individually with each of our clients (monthly reports, day-to-day shared files, etc.). For client access with our current setup in ShareFile, we have clients log in to the web interface, and they can see the highest level of all folders shared with them. Here’s an example of what some of our setup looks like:

  • Client Folders
    • Client 1
      • Client 1 Docs Shared (shared with Client 1)
      • Sign-on Info
    • Client 2
      • Client 2 Docs Shared (shared with Client 2)
      • Sign-on Info
  • Reports
    • Client 1 Month-end Reports (shared with Client 1)
      • August 2020 Reports (shared with Client 1 by inheritance)
      • July 2020 Reports (shared with Client 1 by inheritance)
      • etc.
    • Client 2 Month-end Reports (shared with Client 2)
      • August 2020 Reports (shared with Client 2 by inheritance)
      • July 2020 Reports (shared with Client 2 by inheritance)
      • etc.
  • Resources
    • In-house Resources
    • System Resources (shared with all clients)
    • Training Resources (shared with all clients)

So, in ShareFile, someone from Client 1 would log in, and they would see the following folders:

  • Client 1 Docs Shared (the highest-level folder shared with them from “Client Folders,” even though it’s three levels deep)
  • Client 1 Month-end Reports (the highest-level folder shared with them from “Reports”—only this highest level folder shows, not all the shared subfolders)
  • System Resources (the highest-level folder shared with them from “Resources”)
  • Training Resources (another highest-level folder shared with them from “Resources”)

I figured it would be easy to replicate this in SharePoint Online, where we could just give clients a link to our main document library, and they would easily be able to see the folders within that are shared with them … but it’s turning out not to be easy at all.

We could give each client separate links to the folders that are shared with them, but that’s pretty messy and inconvenient, and I would like the clients to be able to view all their folders within a single document list so I can make it part of a communication site page that can act as a sort of landing page for our clients, where we can post news and give them login access to our primary software system, all in one place.

I’ve had a variety of ideas about how I might manage things, but I’m hesitant to completely rearrange our folder structure and confuse our staff, and a pure metadata approach isn’t going to fly because 1) many of both our staff and clients are just too entrenched in folder-style organization to handle this without me becoming a villain, and 2) I want to use OneDrive to give File Explorer access to files rather than only the web interface, and custom metadata doesn’t seem to be something that can flow through there.

So, I’m hoping someone might know a fairly straightforward way to provide the view like I described or have a better idea about how I can manage the situation.