Time Complexity help for Hashmaps using AVL Trees and separate chaining

Cheers! I am implementing a hashmap using separate chaining with AVL trees as buckets, but I am having trouble proving some time complexities. I have to find the worst-case and average time complexities, both amortized and real, so I have four combinations in total (average amortized, average real, worst-case amortized, worst-case real). My main problem is that I don't know in which of these combinations I should count the rehashing that I do.
I am now gonna explain some more things about my approach:

When inserting an item, I first hash its key and insert into the tree at the hashed position. That insertion should take O(log k), where k <= n is the number of values inside that bucket, which makes it O(log n) in the worst case. Now my problem is with the rehash. I set the maximum load factor of the hashmap to 0.9, which is commonly recommended for the separate chaining approach.

In the rehash:

First I allocate a new hashmap with double the capacity and create the trees for each position of the new hashmap, so that is one pass over the new capacity. I then traverse the old hashmap: I visit every tree and every node inside it, and re-add each node via insertion (i.e., I call hashmap_insert again) so that it hashes to the correct position. When traversing a tree, I first get its first node, and for every node in that tree I call a function tree_next, which, given a node, returns the next node in that tree in O(log n) time, where n is the number of elements in that tree. However, because we keep the trees relatively "short", we can suppose that it's O(1). But when exactly is that assumption valid? When we study the amortized time, the average time, or their combination (average-time amortized)?
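To make the discussion concrete, here is a minimal sketch of the scheme described above. It is not the actual implementation: it uses a plain unbalanced BST in place of the AVL tree (rotations omitted), a recursive in-order traversal instead of a tree_next successor function, assumes distinct keys, and all names and constants (HashMap, tree_insert, the initial capacity of 8) are illustrative.

```python
class Node:
    def __init__(self, key, value):
        self.key, self.value = key, value
        self.left = self.right = None

def tree_insert(root, key, value):
    """BST insert; an AVL tree would rebalance here to keep O(log k)."""
    if root is None:
        return Node(key, value)
    if key < root.key:
        root.left = tree_insert(root.left, key, value)
    elif key > root.key:
        root.right = tree_insert(root.right, key, value)
    else:
        root.value = value
    return root

def tree_walk(root):
    """In-order traversal: O(n) total over the whole tree, i.e. O(1)
    amortized per visited node, even though a single successor step
    (like tree_next) can cost O(log n)."""
    if root is not None:
        yield from tree_walk(root.left)
        yield (root.key, root.value)
        yield from tree_walk(root.right)

class HashMap:
    MAX_LOAD = 0.9

    def __init__(self, capacity=8):
        self.capacity = capacity
        self.size = 0
        self.buckets = [None] * capacity   # one tree root per slot

    def insert(self, key, value):
        i = hash(key) % self.capacity
        self.buckets[i] = tree_insert(self.buckets[i], key, value)
        self.size += 1                     # assumes distinct keys, for brevity
        if self.size / self.capacity > self.MAX_LOAD:
            self._rehash()

    def _rehash(self):
        old = self.buckets
        self.capacity *= 2                 # double the capacity
        self.buckets = [None] * self.capacity
        self.size = 0
        for root in old:                        # every old tree...
            for key, value in tree_walk(root):  # ...every node in it
                self.insert(key, value)         # re-hash into the new table
```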

I would love some help! Thanks!

Are creating separate Yammer communities the best approach to follow, to have unique discussions

I am working on a project for a customer which has 12 offices, and they need separate discussions for each office, where users in other offices should not be able to view that office's conversation. So I am planning to create 12 Yammer communities for the 12 offices and make each community private. Is my approach valid?

sql server – How do I insert data into a SQL Server table from a Python socket, using .split() to separate entries and executemany with a variable?

I am trying to insert RFID tag reads from a socket into a SQL Server database using Python. Since I'm using a TCP socket, the data will sometimes combine and put multiple entries into a single row, and I need each read to be a different row. I have combed through multiple sites, and the closest thing I could find was this: Python Socket is receiving inconsistent messages from Server, but I wasn't able to make it work. I am able to use len to separate the reads, but I have no way of knowing how many reads will be combined together. Here is my current code (the section that is commented out is using len):

while 1:
    def data_entry():
        # (data2,) must be a tuple -- (data2) is just a parenthesized string
        c.execute('INSERT INTO ORFinishing(Tag) VALUES(?)', (data2,))
        conn.commit()

    data = s.recv(1024)
    # '\n' (with the backslash), not 'n': rstrip('n') strips the letter n
    data2 = data.decode("utf-8").rstrip('\n')
    print(data2)

    if len(data2) == 24:
        print("correct")
        data_entry()
    else:
        print("wrong")
        data4 = data2.split('\n')  # split combined reads on real newlines
        print(data4)

        def many():
            # executemany expects one parameter tuple per row,
            # so wrap each tag string in a 1-tuple
            c.executemany('INSERT INTO ORFinishing(Tag) VALUES(?)',
                          [(tag,) for tag in data4 if tag])
            conn.commit()

        many()  # in the original, this call sat inside the def and never ran

    # earlier attempt, slicing fixed-width 24-character reads:
    # one, two = data2[:24], data2[-24:]
    # if len(one) == 24:
    #     c.execute('INSERT INTO ORFinishing(Tag) VALUES(?)', (one,))
    #     conn.commit()
    # if len(two) == 24:
    #     c.execute('INSERT INTO ORFinishing(Tag) VALUES(?)', (two,))
    #     conn.commit()
I have seen several instances of people using executemany, but it was always with a manually entered tuple or list. I also tried turning my data into a tuple. Here is the output that I get:

runfile('C:/a/tcpsocket.py', wdir='C:/a')
623037353035000000000000
correct
623031323638000000000000
623031323734000000000000
623033323830360000000000
wrong
('623031323638000000000000', '623031323734000000000000', '623033323830360000000000')

But it doesn't put any of the data from the string split (the "wrong" branch) into the database; for the case above, only one read ended up in the database. Using len does put data into the database, but since anywhere from 5 to 100 reads could be combined, it's not really a viable option.
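For what it's worth, a common way to handle this is to treat the socket as a byte stream and do the framing yourself: keep any partial read in a buffer until its terminating newline arrives, then insert only the complete lines. Below is a hedged sketch, assuming the reader terminates each tag with '\n' (note the backslash: rstrip('n') and split('n') in the code above operate on the letter n, not on newlines) and a pyodbc-style cursor; parse_reads and insert_reads are hypothetical helper names, not part of any library.

```python
def parse_reads(buffer, chunk):
    """Append a received chunk to the leftover buffer and return
    (complete_reads, new_leftover). A single recv() may deliver several
    tag reads at once, or cut one in half, so only full lines are emitted."""
    buffer += chunk.decode("utf-8")
    *complete, leftover = buffer.split("\n")
    return [line for line in complete if line], leftover

def insert_reads(cursor, reads):
    # executemany expects a sequence of parameter tuples, one per row,
    # so each tag string is wrapped in a 1-tuple
    cursor.executemany(
        "INSERT INTO ORFinishing(Tag) VALUES (?)",
        [(tag,) for tag in reads],
    )
```

The receive loop then becomes: initialize leftover = "" before the loop, and inside it do reads, leftover = parse_reads(leftover, s.recv(1024)), then insert_reads(c, reads) and conn.commit() whenever reads is non-empty.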

dnd 5e – Are the Nine Hells separate planes or a single plane?

The Dungeon Master’s Guide lists the Nine Hells as a single plane in Chapter 2: Creating a Multiverse. It also often mentions the home plane of devils, without ever specifying a level for any of the devils.

Mordenkainen’s Tome of Foes counts the Nine Hells as a single plane. In Chapter 1 The Blood War:

To the good fortune of the rest of the multiverse, almost all the battles in the Blood War take place in *the Abyss and the Nine Hells*. Whether by cosmic chance or the design of some unknown power, the dark waters of the Styx provide passage between *the two planes*, but pathways to other realms are at best fleeting and unreliable.

(emphasis mine).

Descent into Avernus also refers to the Nine Hells as a single plane. In Chapter 3 Pervasive Evil:

Evil pervades the Nine Hells, and visitors to this plane feel its
influence.

…the creature’s alignment reverts to normal after one day spent on a plane other than the Nine Hells.

set theory – Can we separate the almost-disjointness sunflower numbers?

This question concerns a new cardinal characteristic of the continuum that arose out of issues in my answer to the question Sunflowers in maximal almost disjoint families.

A family $\mathcal{A}$ of infinite subsets of $\omega$ is *almost disjoint* if any two members of the family have finite intersection. Such a family is a *maximal* almost disjoint family if it cannot be extended to a larger almost disjoint family.

A $\Delta$-system, also known as a sunflower, is a family of sets in which every pair of members has the same intersection (for example, $\{1,2,3\}$, $\{1,2,4\}$, and $\{1,2,5\}$ form a sunflower of size $3$ with common core $\{1,2\}$).

In his earlier question, Dominic had asked whether every maximal almost disjoint family must contain an infinite sunflower. In the general case, this seems still to be open, but my answer there shows that under the continuum hypothesis, there is a maximal almost disjoint family containing no sunflowers even of size $3$. Indeed, the construction there shows that there is a maximal almost disjoint family $\langle A_\alpha \mid \alpha<\omega_1\rangle$ such that every $A_\alpha$ has different intersections with every earlier $A_\beta$, for $\beta<\alpha$. This property implies that there can be no sunflower of size $3$ in this family. (But notice by a simple pigeon-hole argument that it will be impossible to extend this stronger property to enumerations beyond $\omega_1$.)

My questions concern the property of almost disjoint families that
are maximal with respect to the property of not containing any
sunflower of a certain size.

Question 1. If an almost disjoint family of infinite subsets of $\omega$ is maximal amongst almost disjoint families with respect to the property of not containing a sunflower of size $3$, is it a maximal almost disjoint family?

And more generally, I ask the same for sunflowers of any particular size.

The question leads naturally to new cardinal characteristics of the continuum. Namely, let us define the *almost-disjointness sunflower number*, officially denoted $\mathfrak{a}_\kappa^\Delta$, but let me immediately drop the superscript and write just $\mathfrak{a}_\kappa$, to be the size of the smallest almost-disjoint family that is maximal among almost-disjoint families with respect to the property of not containing a sunflower of size $\kappa$. (We consider only $\kappa\geq 3$.)
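In symbols, using only the notions already defined above, this reads:

$$\mathfrak{a}_\kappa \;=\; \min\bigl\{\,|\mathcal{A}| \;:\; \mathcal{A} \text{ is an almost disjoint family containing no sunflower of size } \kappa, \text{ and is maximal with this property}\,\bigr\}.$$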

The construction in my other answer shows that $\omega_1\leq\mathfrak{a}_\kappa$.

Question 2. Can we separate these various cardinal characteristics $\mathfrak{a}_\kappa$ from each other, and from the almost-disjointness number $\mathfrak{a}$?

For example, is it consistent with ZFC that $\mathfrak{a}_3<\mathfrak{a}$? This would be a strong refutation of question 1. Is it consistent that $\mathfrak{a}_3\neq\mathfrak{a}_4$?

At first I had thought it was clear that $\mathfrak{a}_\kappa\leq\mathfrak{a}$, the almost-disjointness number, which is the smallest size of any maximal almost disjoint family. But upon reflection, this no longer seems clear to me, since perhaps there could be a small maximal almost disjoint family that contains a lot of sunflowers, while the smallest maximal sunflower-free family is larger. Or it could be strictly smaller, since a maximal sunflower-free family might not be a maximal almost disjoint family. I had similarly expected that if $\kappa<\lambda$, then there should be some trivial provable relation between $\mathfrak{a}_\kappa$ and $\mathfrak{a}_\lambda$. But unless I am mistaken, this now also doesn't seem to be immediate.

Question 3. What are the provable relations between $\mathfrak{a}_\kappa$, $\mathfrak{a}_\lambda$, and $\mathfrak{a}$ when $\kappa<\lambda$?

Can we say something even about the relation of $\mathfrak{a}_3$ and $\mathfrak{a}_4$, or their relation to $\mathfrak{a}$?

programming languages – Can SE be divided into several separate career paths with separate skillsets?


unity – How do I split an anim into separate takes?

I can split the animations of an FBX file into individual takes by script.

However, I have not found a way to do that with .anim files.

Is it not possible to split .anim files into individual takes?

ps:
This is the script that I use to split an FBX file into individual takes. However, I can't use the same script for .anim files because there is no ModelImporter for an .anim file:

ModelImporter nModelImporter = (ModelImporter)AssetImporter.GetAtPath(sPathOfFBXFileInAssetFolder);

// the cast must be to the array type: (ModelImporterClipAnimation[]), not (ModelImporterClipAnimation())
nModelImporter.clipAnimations = (ModelImporterClipAnimation[])nList.ToArray(typeof(ModelImporterClipAnimation));


macos – Export all images in a burst photo in Photos to separate images on my laptop

I've looked through the related material I could find, and still cannot find any instructions that match what I see on my laptop for exporting to files on my hard drive.

Question: I have a burst with 18 photos that were transferred to the Photos.app on my MacBook with macOS 10.15.7 from my iPhone 6. I want to export all of them to 18 separate images on my hard drive. Is this possible without going to “the cloud” first or installing more software?

[screenshot: can't export all images from a burst photo in Photos.app in macOS]

Above: I've selected all 18 images by going through them with the left/right arrows to scroll through the row at the bottom and clicking each image so that a check mark appears in its bottom right. Then I clicked the blue Done at the top right of the display. I never saw the "keep all" button mentioned in some Apple tutorials, but perhaps those were for previous versions.

Below: The appearance is unchanged in the main Photos window, and I only have the option to export one photo. If I go back to the burst selector (in the first screenshot), select a few items, click Done, and then come back here, I still don't have the option to export anything other than one photo.

No matter what I do, I can't get the "export GIF" feature to work either. (Note: I want the individual photos; the GIF feature being blank just indicates that I'm still missing some step.)

[screenshot: can't export all images from a burst photo in Photos.app in macOS]

