design patterns – Best practice for having a single file appear in two directories (e.g. shared examples for several elements)

Sorry for the title; I couldn't find a clearer way to phrase it.

I'm developing a CSS library in which I style each element and give each one its own example HTML page.

However, I came across two elements (<blockquote> and <cite>) whose examples should be shared, since they are almost always used together. Here is the sample code:

It was a bright cold day in April, and the clocks were striking thirteen.

(Code by the MDN contributors.)

So, to demonstrate these two elements, I have to put the examples in two directories that look like this:

+-- cite
    +-- index.css
    +-- index.html
+-- blockquote
    +-- index.css
    +-- index.html

So I am duplicating code, which I really want to avoid, because I will need to update these examples in the future when, say, the MDN documentation changes. And of course there are more cases like this one.

I searched the Stack Exchange network and was surprised not to find a question about this, so I must be using the wrong keywords. Suggestions are welcome.

As this question is not specific to HTML, I have not added those tags.

So, what is the best way to make one file appear in two directories? If that is not possible in general, an answer limited to certain file types would still be helpful.

If even that is impossible, I would like to know the best way to live with the duplication. Thanks in advance, and again sorry for the title.
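
One common way to avoid the duplication (just a sketch, assuming a POSIX filesystem; the shared/ directory name is made up, not part of the original layout) is to keep a single copy of the example and link it into both element directories:

# keep the real files in one place
mkdir -p shared/blockquote-cite
mv cite/index.html cite/index.css shared/blockquote-cite/

# expose the same files in both element directories via symlinks
ln -sf ../shared/blockquote-cite/index.html cite/index.html
ln -sf ../shared/blockquote-cite/index.css  cite/index.css
ln -sf ../shared/blockquote-cite/index.html blockquote/index.html
ln -sf ../shared/blockquote-cite/index.css  blockquote/index.css

Hard links (ln without -s) would also work for regular files; symlinks simply make the shared source obvious when browsing the tree.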

finder – Why did the group of files and directories in the httpd document root change to _www?

It's a bit technical:

I do localhost development on a Mac running High Sierra and on other
Mac systems on my local network.

I've been working on content-management code in PHP.
Recently I had the idea of adding my FTP user account to the _www group.

Then the files placed in the document root can be changed over to the web server's user account by copying the FTP user's content with the help of a PHP script; doing it that way does not change the group. This is also handled by the PHP script I am developing.

The goal is to be able to edit content both with PHP and with the
FTP user, and to set "everyone" to no access. Directories are given
0770 permissions and files 0660 permissions.
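
As a rough sketch of that scheme (the document-root path below is a placeholder, not taken from the post):

# hand the content to the _www group, as described above
sudo chgrp -R _www /path/to/docroot

# directories 0770, files 0660, so "everyone" gets no access
sudo find /path/to/docroot -type d -exec chmod 0770 {} +
sudo find /path/to/docroot -type f -exec chmod 0660 {} +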

BUT I have a directory on the Desktop owned by the FTP user
and belonging to the "personal" group. When I upload this content into the
document root using an FTP client, the group of the content is changed
to _www (but not its ownership). The PHP script is
not a CLI script that would be executed by the FTP client (Fetch).

I've also tried using the Finder to put this content into the document
root; done that way, the group is not changed.

But once the content has been moved into the document root and I copy it
back to the Desktop, the copy on the Desktop now has _www as its group.

THE QUESTION:

Is there a security problem lurking behind the scenes with this approach?
Fortunately, I can use sudo (or root) to change ownership and group
assignments.

More information:

But on a remote site that will not be possible, and I will have the
problem of scripts not performing chgrp (change group) while
NO error is reported. The code simply does not do it.

I've been doing development long enough to know how to test whether
that code is actually attempted by the script during its execution.

Thank you for your time and attention.

JK

apache2 – Website only displays a directory listing in certain browsers

I have a website (https://hypermotions.com) that shows only a directory listing in some browsers (such as the Chrome app on iOS), while most desktop and mobile browsers display the site correctly.

What makes the Apache server behave this way? The website is built using https://imcreator.com and is only linked in from there, so I have no control over the files, such as .htaccess.
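
For context only (the directory path below is a placeholder, not taken from the question): whether Apache falls back to a generated directory listing usually depends on the DirectoryIndex document and the Indexes option, roughly like this:

<Directory "/var/www/example">
    # serve this file when the directory itself is requested
    DirectoryIndex index.html
    # do not let mod_autoindex generate a listing when no index is found
    Options -Indexes
</Directory>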

Free directories?

Are there any free web directories I can still submit my site to?
Thank you for your help.

http – Is Apache HTTPd able to create intermediate directories during a PUT?

I'm using Gradle, which can publish artifacts to repositories over HTTP, with the one caveat that it expects a smart repository manager as the target, one that takes care of creating intermediate directories where necessary. Without such a manager things fail, and this problem has gone unsolved for years.
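
For illustration only (the host and artifact path are made up, not from the question): publishing in this style boils down to plain HTTP PUT requests against nested paths, and the server is expected to create the intermediate collections itself:

# what the publish step amounts to: a PUT to a deeply nested path
curl -T build/libs/app-1.0.jar \
     https://repo.example.com/releases/com/example/app/1.0/app-1.0.jar
# a dumb WebDAV/HTTPd target typically rejects this (e.g. 409 Conflict)
# because the parent collection .../app/1.0/ does not exist yet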

Is there anything available in HTTPd, as a module or otherwise, that would simply create these intermediate directories transparently?

I did not find anything, and I would rather not implement something myself, e.g. using mod_perl. Switching to a smarter repository manager might be simpler, but if HTTPd already provides something out of the box that I just missed, that would be easier still.

Thank you!

Multiple directories are not created by my PowerShell automation

I was writing code to create multiple directories, optionally nested, in PowerShell.
I'm using VS Code and I'm pretty new to PowerShell.

The problem is that it will not create multiple directories, but when I do it manually with the same settings, it works.

Here is the code:

    echo "Folder Creator for FS2019 by Skaytacium, enter the values for upto 32 nested folders! Remember though, 260 characters is the max nesting limit!"

$count = Read-Host -Prompt 'How many folders? Value'
$count = [int16]$count   # cast the prompted string to an integer

$sub = Read-Host -Prompt 'Do you wanna nest? y/n'

$namestring = ""

if ($sub -eq "y") {
    echo "The specified folders will be nested"

    while ($count -gt 0) {

        $namestring = $namestring + (Read-Host -Prompt 'Name') + "/"   # path separator, so the final mkdir creates the nested chain
        $count--
        echo $namestring

        if ($count -eq 0) {
            mkdir $namestring
        }
    }
}

elseif ($sub -eq "n") {
    echo "The specified folders will be consecutive (in the same dir)"

    while ($count -gt 0) {

        if ($count -gt 1){
            $namestring = $namestring + (Read-Host -Prompt 'Name') + ", "
            $count--
            echo $namestring
        }

        elseif ($count -eq 1) {
            $namestring = $namestring + (Read-Host -Prompt 'Name')
            $count--
            Convert-String -Example $namestring
            echo $namestring
            mkdir $namestring
        }
    }
}

Pause

I know it's not optimized, but I'm new and I just want a quick automation.

I can provide logs if you want.

thank you,
Sid

SharePoint Server – Trying to Add an HTML Page with Directories

I have seen a lot of content about this on Stack Exchange, and everyone who posted seems to have a common problem, but mine is a little more unusual. Yes, I'm trying to upload an HTML page, but I also have directories alongside it: img, css and js directories. Each directory can contain anywhere from 2 files for css and js up to 126 in img. My steps are:

  1. Site Settings
  2. Site Libraries and Lists
  3. Create new content
  4. + New
  5. Document Library
  6. Provide a name (with the option to show it in the site navigation)
  7. Drag and drop HTML and directories into the document library.
  8. Rename the .htm file to .aspx from Explorer

However, the problem is that not only does it not resolve the directory paths in the HTML code, but even though I renamed the HTML to .aspx, it still uses the .aspx file that was generated originally when the document library was created (AllItems.aspx, I assume). It simply downloads the renamed file as a file rather than rendering it as a real HTML page. I've checked my ribbon and I do not get any option to set my HTML, or the renamed .aspx page, as the default home page.

I've also tried adding a page to the root /TechSupport directory, but I can only add a single HTML page there. That HTML can see the directories, but I cannot add the other pages I have created, which also have their own directories.

I've also tried adding a wiki page and replacing the SharePoint-generated code with that of one of the .aspx documents, placing my code inside the generated content, but all that does is mess up the formatting of the page. I guess SharePoint does not like HTML pages with directory paths. Unless you have other solutions?

It's odd, because SharePoint seems to be looking for a file with specific permissions.

bash – Extending PATH with a list of directories read from a text file; the character '~' is not expanded to $HOME

PATH is extended with a list of directories extracted from a text file:

$ cat ~/.path
~/.local/bin
~/W-space/esp/esp-open-sdk/xtensa-lx106-elf/bin
~/W-space/research/recognition/voxforge/bin
~/W-space/research/recognition/voxforge/bin/julius-4.3.1/bin

as follows (the following lines live in one of the Bash startup files):

declare -a BATH_ARRAY=($(cat ~/.path)) 2>/dev/null # extend path
for BATH in "${BATH_ARRAY[@]}"
do
case ":${PATH}:" in
  *:${BATH}:*) ;;
  *) PATH=${PATH}:$BATH && export PATH;;
esac
done

It is a basic iteration over the array of PATH entries extracted from the ~/.path file (above). Inside ~/.local/bin (which is in the PATH) there is a bl script that I am able to invoke, as follows:

$ ls -l ~/.local/bin/bl
-rwxr-xr-x 1 romeo romeo 6304 Nov 17 09:06 /home/romeo/.local/bin/bl*
$ bl 1 #no error!
$

However, some side effects have already been briefly discussed in the "'sh' environment not respecting the PATH extensions, the user's local PATH variable not effective?" question, and they include symptoms such as the following:

$ bl 1
$ sh -c 'bl 1'
sh: bl: command not found
$
$ bl 1
$ whereis bl
bl:
$

The consensus was that the '~' character should be expanded to the user's HOME before being appended to PATH. How can that be achieved while still keeping an external file as the source of directories for the PATH extension? Where does the problem lie in the current approach? Help very much appreciated 🙂
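
One way to do that (a sketch only, not taken from the original question) is to substitute a leading literal ~ with $HOME just before each entry is appended, keeping the rest of the loop as it is:

declare -a BATH_ARRAY=($(cat ~/.path)) 2>/dev/null # extend path
for BATH in "${BATH_ARRAY[@]}"
do
  BATH="${BATH/#\~/$HOME}"        # expand a leading literal ~ to $HOME
  case ":${PATH}:" in
    *:"${BATH}":*) ;;
    *) PATH="${PATH}:${BATH}" && export PATH ;;
  esac
done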

I will submit your website to 500 directories, very quickly for $5

I will submit your site to 500 directories, very quickly

Give me your website and I will submit it to 500 directories in 1 day; FAST service. Just give me the address of your website. After posting, I will give you proof, and then you pay me.


apache 2.4 – Exclude directories with ProxyPassMatch

I'm trying to exclude multiple folders from a reverse proxy.
It works if I use:

ProxyPass /folder1 !
ProxyPass /folder2 !
ProxyPass / https://www.example.org
ProxyPassReverse / https://www.example.org

But I want to exclude the folders using ProxyPassMatch, so that everything is in one regular expression:

ProxyPassMatch ^(/folder1|/folder2) !
ProxyPass / https://www.example.org
ProxyPassReverse / https://www.example.org

The only problem is that the regular expression does not work.